Thursday, January 16, 2014


Who is thinking you? It may seem like a strange question, given that most of us believe we are the masters of our own brains and actions. But neuroscience, the teachings of ancient wisdom traditions, and perspectives from shamanism would advise us otherwise. Scientific studies suggest that we effectively have three brains. The oldest, which we share with reptiles, is where most of our actions originate. Ancient wisdom streams, especially those that include meditation, trace human thought, will, and drives, in no small part, to sources beyond our own mental apparatus.

Part of the answer may be that we not only share the most “primitive” part of our brains with our animal ancestors, but, as cognitive approaches to archeology suggest (represented by the likes of James L. Pearson, Colin Renfrew, Ezra B.W. Zubrow, and Steven Mithen), we haven’t changed much as a species in the last 100,000 years. Anthropologist Jared Diamond, author of “The World Until Yesterday: What Can We Learn from Traditional Societies?”, says that one reason we are interested in so-called “tribal,” traditional, non-Western societies is that we, the people he calls WEIRD (Western, Educated, Industrialized, Rich, Democratic), are closer to them than we can imagine, even dressed up as we are in our business attire and wearable technology.

In fact, we often accept all too willingly the advantages of modern technology without fully understanding what may be lost in the process. Diamond goes on to assert that strong social bonds are an advantage of traditional societies that technology-dominated cultures are losing. In a previous post, social media was cited as creating a loss of intimacy that would make personal contact worth more than gold in the future. John A. Livingston, a professor of environmental studies, goes further, saying that dependence on technology “has made us not merely the servants of our own technology, but one of its products.” Anyone who has not checked their profile settings on Facebook or another favorite social network should be aware that if you’re opting in, you are the data. That said, there is data and then there is data, as anyone who has meditated has experienced.

One of the first things that occurs in meditative states is the flurry of incoming and ongoing thoughts. In Zen and other Eastern traditions, this has been called the “monkey mind,” because thought seems to be produced randomly, sometimes in a ramble that appears as if out of nowhere, like a brain swinging from treetop to treetop in the jungle of the mind. Shamanic techniques, like meditative ones, represent time-tested technologies that help us attempt to saddle this runaway horse of thought so that we, the riders, can better direct its course and, as a result, our intention and action. As the mystic teacher Gurdjieff once prompted, we must ask ourselves: “Who is the horse and who is the rider?” Unfortunately, for anyone who has tried, it doesn’t come easily. One reason, we’ve now learned from neuroscience, is that most decision-making—estimated at an incredible 95%—occurs below the surface of conscious thought.

Neuromarketing, a relatively recent field where applied neuroscience meets sales, points to a target that is the oldest vestige of our connection with all animals that have a spine: the so-called “reptilian” brain that sits atop the brainstem. This seat of the classic “fight or flight” response is a kind of binary switch that identifies whether outside stimuli are dangerous or safe. It also helps identify whether sensory input represents the possibility of food or sex. A later development, the second part of the brain system, is the midbrain, which governs emotional response. The latest addition, and the one that truly sets us apart from the animal pack, is the frontal cortex, which introduced into the system what might be called, in the language of the HAL 9000 computer, our “logic center.” Among other things, it is responsible for scenario planning, which differentiates us from other animals in the hunt as well as in business strategy.


Here, science has mapped what ancient tradition has known for a very long time. Gurdjieff, who drew on age-old wisdom streams from the Sufi and Tibetan to the Siberian and Ancient Egyptian, referred to humans as “three-brained beings” in his magnum opus, “All and Everything.” It leads one to believe that the ancients were onto something that might be put to good use in the modern world to elevate the business game. In fact, neuromarketing labs already exist and use technology like fMRI scanners to locate specific brain functions for brands to target in marketing to us. Companies like Coca-Cola, Disney, and the CBS Television Network have them; CBS’ lab is, interestingly, located close to the Las Vegas Strip, where it recruits its subjects (one hopes, not too drunk a sample).

In their classic text on the subject, “Neuromarketing: Understanding the ‘Buy Button’ in Your Customer’s Brain”, Patrick Renvoise and Christophe Morin provide a helpful analogy as to how these three brains all fit together: You’re out hiking and come across something on the ground that looks like a snake. Your reptilian brain immediately kicks into gear and you jump back. Your logic center takes a closer look from afar and sees that it’s not a snake at all, but a broken stick that isn’t moving. The midbrain then directs you to wipe your brow, “Phew, that ‘snake’ was only a stick. Man, am I lucky!”
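Renvoise and Morin's snake-or-stick sequence amounts to a fast-path/slow-path pattern: a coarse, instant reflex fires first, a slower check reclassifies, and an emotional tag follows. A playful sketch of that sequence (all function names and strings are my own illustration, not from the book):

```python
# A toy model of the three-brain sequence: reflex first, logic second,
# emotion last. Purely illustrative; the labels are hypothetical.

def reptilian(stimulus: str) -> str:
    """Instant, coarse threat check: jump first, ask questions later."""
    return "jump back" if "snake" in stimulus else "ignore"

def cortex(stimulus: str) -> str:
    """Slower logic center: take a closer look and reclassify."""
    return "stick" if "stick-shaped" in stimulus else "snake"

def midbrain(verdict: str) -> str:
    """Emotional tagging after the fact."""
    return "relief" if verdict == "stick" else "fear"

stimulus = "snake-like stick-shaped object on the trail"
reflex = reptilian(stimulus)   # "jump back" -- fires before any analysis
verdict = cortex(stimulus)     # "stick" -- the closer look
feeling = midbrain(verdict)    # "relief" -- "Man, am I lucky!"
print(reflex, verdict, feeling)
```

The point of the pattern, in marketing as in hiking, is that the cheap reflex runs first and the expensive analysis only runs afterward.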

What does this mean specifically for marketing and sales? Quite simply, that in order to sell, you need to appeal to what Renvoise and Morin call the “Salesbrain.” In other words, you need to appeal to the “reptilian” part of the brain and make it easy for your customer to say “yes.” There are many sales books about “getting to ‘yes,’” but not as many about the power of its opposite. Here again, an ancient practice from India that goes back at least 8,000 years, the art of chanting mantras or power words in meditation, links belief to sound.

Dr. Andrew Newberg and Mark Robert Waldman, in “Words Can Change Your Brain,” point out that the most dangerous word we use is “no,” which literally causes the brain to go into stress mode when it is heard. So Norman Vincent Peale had a point when he dedicated his classic volume to “The Power of Positive Thinking.” Newberg and Waldman founded the field of Compassionate Communication based on their findings in neurology, and they offer practical techniques for leaders to use language to rewire the brain and communicate more effectively.

In “Rogue Primate,” John A. Livingston goes further, suggesting that we invent technology to get us closer to the things we have lost in the course of becoming civilized, specifically the world of nature. Gretel Ehrlich, who has dedicated her life to writing poetically about the natural world, says, “Nature is the only true artist, and we are its apprentices.”

Yossi Ghinsberg’s account of his “apprenticeship” at the hand of the jungle itself, and his survival after being lost for weeks in the Amazonian rainforest, provides even more foundation for the wisdom of having access to our “tribal mind.” At the least, he suggests, we should cultivate an inner GPS based on knowing where we are not just in relation to maps—which may be as erroneous as Apple’s attempt at navigation—but with respect to celestial events and nature’s compass. He codified the principles that helped him survive in “Laws of the Jungle: Jaguars Don’t Need Self-help Books—Profound Lessons Inspired by an Extraordinary Story of Survival.”

In a previous post, “The Cave Mind Operating System,” I drew some conclusions and, I hope, offered a few practical lessons from the study of prehistoric rock art, its discovery in modern times, and contemporary theories relating paintings that can be as much as 30,000 years old to shamanism and hunting magic.

What other lessons can we learn from studies of cognitive archeology, neuromarketing, and behavioral science?


“Creating ways to keep us connected is…the central problem of mammalian evolution,” according to neuroscientist Dr. Matthew D. Lieberman. The ability of the reptilian brain to determine threat is rooted in our being social animals. But social media doesn’t always do the job of keeping us connected, no matter how many “connections” you have on LinkedIn, “friends” on Facebook, or “followers” on Twitter. Sociological and anthropological studies of modern cultures and tribal societies have shown that the maximum number of acquaintances most brains can handle is about 150, which is also the average maximum size of a tribal unit.

“FaceTime” is not just an Apple application. Social media and technology have made in-person meetings more essential than ever, and the human face remains our social mask for those who can read it. Any good salesman will also tell you that one of the goals of a retail shopping experience is to get the product into the hands of a potential buyer. Possession is a psychological state, and ownership often starts with the feeling of touching the merchandise. The bottom line is that, social media aside, we still trust face-to-face interactions more than virtual ones in our decision making, whether we recognize it or not.

A Wall Street Journal study showed that from job interviews to dating, humans decide whether they like another person within a mere 20 seconds. Traditional business wisdom about smiling, firm handshakes, eye contact, and proper attire holds water for the subconscious mind. A working knowledge of body language was popularized by negotiation experts like Gerard Nierenberg and Henry H. Calero in the late 1960s and early 1970s and is still relevant. Emotions from friendliness to lying, flirtation to boredom, are written all over the face, gestures, postures, and other forms of physical expression that can be read like a book. It should be no surprise that Dan Zarrella’s research on over a million Facebook interactions shows that over 60% of what is shared on the network is photos. His work is essential to understanding the science behind marketing.

Rudolf Steiner said that one could tell more about a man by the way he walked than by any other method. The lesson is to be aware that snap judgments are made both in person and on a shared screen. It is easier to work around an initially unfavorable impression in person than online, where the average user spends only 3-8 seconds on a web page and a face’s effect is magnified the smaller the screen gets. Given that there are now over two billion mobile phones globally, the best practice is probably to use those initial seconds favorably to “get the meeting.”


Much has been written about the creative power of collaboration in this era of the “open organization,” exemplified by the tech startup with its common spaces, non-hierarchical structures, and relaxed management styles. Collaboration is also the foundation of the rule of democratic law, in contrast with dispute resolution in tribal societies, where the individuals involved do not have representation but usually know each other and resolve their differences face-to-face.

It may seem counterintuitive, then, but the so-called “wisdom of the crowd” isn’t always so wise. Anyone who has run a focus group has seen the lemming effect that takes place when an “alpha” participant becomes the effective group leader and sways opinion while silencing outliers who might otherwise speak differently.

Despite Seth Godin’s popularization of the word “tribes” to refer to market segments in the digital world, and the name of my blog, “Tribal Media”, sometimes we need to avoid the cliff of groupthink and say, “Take me to your leader”—which may, in fact, be you.


One of the challenges that social media and digital marketers often face is the question: “Where is the ROI?” Despite the many digital marketing companies promising “virality,” the first rule of thumb is that organic social marketing is usually slow growing. There is also a difference between ROI and “reach.”

Even in the television business, it’s never been proven—except on infomercials—that commercial spots are responsible for unit sales. It’s still all about awareness.

So, when you see a commercial about a detergent that makes a family both clean and “happy” (as Don Draper would say), you are apt to remember the product when you are walking down the supermarket aisle and spot the same colors and branding. The same is true for social media, except that the CPM, or a standard metric for success, has yet to be agreed on.

While clicks, uniques, social reach, and other terms have been applied to mirror the conventions of offline media metrics, “social” is by nature a totally different animal, because it is ideally a circle of conversation that establishes value first, prior to conversion to a sale.
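A rough illustration of why the offline math maps imperfectly onto social: CPM prices exposure, while social value is closer to the share of impressions that turn into conversation. A minimal sketch, with all campaign figures hypothetical:

```python
# Offline-style metric vs. an engagement-style metric.
# All numbers below are made up for illustration.

def cpm(cost: float, impressions: int) -> float:
    """Cost per thousand impressions: the classic offline media metric."""
    return cost / impressions * 1000

def engagement_rate(interactions: int, impressions: int) -> float:
    """Share of impressions that became a conversation (likes,
    comments, shares) -- the circular value social ideally creates."""
    return interactions / impressions

spend = 5000.00          # dollars (hypothetical)
impressions = 2_000_000
interactions = 36_000    # likes + comments + shares (hypothetical)

print(f"CPM: ${cpm(spend, impressions):.2f}")                            # $2.50
print(f"Engagement: {engagement_rate(interactions, impressions):.1%}")   # 1.8%
```

The two numbers answer different questions: the first tells you what reach cost; the second hints at whether a conversation is actually happening.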

A look at the chart above shows that Starbucks is sending out more “smoke signals” via Twitter than Coca-Cola or Pepsi, which may be one reason it has become the kind of lifestyle brand the beverage companies aspire to be. It doesn’t take a brain scientist to see that Pepsi or Coke could up the ante by sending out more smoke signals, or tweets, to differentiate itself from its competitor.


Emoticons and social media profiles can tell us a lot about their owners. Emoticons arose not only as shortcuts, but because images can convey emotion better than language. The rise of the successful infographic demonstrates the need to present the noise of data in a highly visual, shrink-wrapped format. Brand logos are symbol systems that convey emotion and value, engender aspiration, and create lifestyle clans of belonging. The popularity of graffiti, with some of its practitioners considered artists in their own right, is another indication of the human need to make marks in time to establish both identity and territory. Urban walls are a constant reminder of this common thread binding prehistoric and modern societies.

Text is a similarly abbreviated language, with its acronyms and shortcuts. The desire to be in constant communication seems to be a mania that demands packing as much information into as short a burst as possible. Where the half-hour sitcom and hour drama once ruled, short-form video is now a parallel universe that contributes to the 8+ hours a day the average American spends on a screen of some kind.

But texts and email fall short when compared with the lost arts of letter writing and epic poetry, and their meaning and tone are often misunderstood, many times in unfavorable ways. Corporations are learning this the hard way, and their missteps in handling crisis communications have enabled an expansive digital reputation management industry.

The result is that long-form writing will stand out as a practice, much as scribes were the only literate people in the ancient cultures of Egypt and Mesopotamia, where oral traditions originally predominated. If we take a deep dive into deep time, it’s easy to forget that writing as such was invented only about 5,200 years ago.

Slang wars and urban dictionaries aside, we’re going back to the future with the written word. The right word, spoken at the right time, is still the most powerful medicine. The Muscogee poet Joy Harjo wrote: “The sound of a voice will often reveal a map of destiny.” One only has to remember the wartime speeches of Winston Churchill to know what she meant. Conversely, Marshall McLuhan pointed out that the rise of Hitler was due to his command of radio broadcast as a propaganda tool, and that had television been invented at the time, he would have been seen through the visual medium as the madman he was--and not risen to power.

On learning about the invention of the alphabet, Aristotle was purported to have said, “There goes the neighborhood!” referring to the inevitable loss of memory due to the faltering of oral tradition. What happens when memory becomes storage in “the cloud” and messages like Vine videos are 6 seconds long, Snapchats disappear in 10 seconds, and the average YouTube video is watched for 2:46?

McLuhan also recounts how rare silent reading was in the pre-modern world: St. Augustine marveled at seeing St. Ambrose read without moving his lips, and one who could recite texts from memory was regarded as a magician. Those with a command of the written word will become a new knowledge class and will have more power than those who send the thousands of messages that trap us every day.


If pictures are more effective than words at conveying emotion, then music trumps them both. Why then would we ask, “Where can we find the silence?” A sage once said that before you speak, you should ask yourself, “Are my words going to improve the silence?” How do we find the place creative energy comes from? With the proliferation of management books on innovation and creativity that emphasize systems and methodologies, it’s always helpful to ask a musician. No less an expert than Keith Richards says: “The silence is your canvas.” Famously, when asked for a definition of jazz, Dizzy Gillespie said, “It’s the silence between the notes.”

We exist in a media environment where the average American receives over 3,000 messages a day. This makes the pursuit of silence next to impossible--and silence affords the moments when the creative mind can kick in, allowing intuition and ideas to flow uninterrupted. We also live in a time when the accidental nature of discovery and invention--in dreams, for example--is not the goal.

“In daily life, because triumph is made more visible than failure, you systematically overestimate your chances of succeeding,” observes Rolf Dobelli. He sees much thinking about personal and professional success as suffering from what he calls “survivorship bias.” He reminds us to examine how we overestimate our chances of achievement by “visiting the graves of once-promising projects, investments, and careers.” For every best-selling book, he points out, we should count the thousands of unpublished authors littering the publishing battlefield.

When creating, we always need to ask ourselves: are we “Man the Toolmaker,” as anthropologists have characterized our species? Or are we more subject than ever to the idea that technology will save us time and offer other efficiencies, which actually makes us more “Tool the Man-maker”? In many respects, we need look no further than nature, the world of ongoing creation, adaptation, and renewal, for the answer.

Feedback is the way that nature learns, according to Tachi Kiuchi, former Chairman and CEO Emeritus of Mitsubishi Electric America, and Bill Shireman in their 2002 book, “What We Learned in the Rainforest: Business Lessons from Nature.” Ethnologists are taught to listen rather than dominate conversation, and one of the most effective words to use in marketing messages for social media engagement is “you.” The social web is dominated by self-interest; in order to survive, thrive, and generate virality, it is critical to serve the interest of the other first. The art of listening is connected to the discipline of finding that inner silence where one’s own voice and agenda can be quieted for a better understanding of most business situations.

According to science writer and filmmaker Jonnie Hughes, “Highly social, brainy primates with time on their hands are able to watch the actions of others and copy them…” The ability to imitate, then, is hardwired—it has actually been called “the art of aping”—but it requires time, afforded when the mind is disciplined and calm, with the openness to mirror the other. Jeffrey Hayzlett, former CMO of Kodak, takes the notion a step further for organizations as a whole in his management book, “The Mirror Test.”

The search for silence will require expanding effort, especially among generations oriented more toward input than output. At a conference on the future of television, the singer, actor, and activist Ruben Blades once observed: “We will be the best informed generation to die of ignorance.” While meditation and shamanism offer a variety of techniques for the seeker of the kind of creative answers to be found in silence, we also need to be aware that no matter how sentiment-savvy search engines become at filtering data—or even with the current trend of human-curated search—there is a cost, and ultimately, you are being searched as well.


“What can business learn from nature?” ask Kiuchi and Shireman. For starters, we have moved from an online world where destination sites once ruled, costing marketers dearly in the budgets needed to drive traffic and eyeballs to URLs, and arrived at a radial, networked web model. This model actually mirrors how things work in nature. “The global integration of networks creates a network ecology—literally, a place in which people can gather, conduct business, share ideas, and build relationships. People will be able to conduct their activities increasingly in the global network ecology—the Infosphere,” says Michael Vlahos.

Diagrams based on and courtesy of Kiuchi and Shireman

The irony is that limits are also a key positive force, one that drives adaptation and innovation in the rainforest. The lesson is that one should create more than one consumes. Kiuchi and Shireman suggest that profitability is linked to companies that are disciplined enough to use limits to “force (and) channel action toward the creation of value.” Today, the notion of a value chain should be updated to something more like the “closed value web loop” that drives the natural cycles of the seasons in the forest and in life. The true business “plan” starts when we ask ourselves what we value most.

It may be helpful to keep in mind a perspective from the world's leading expert on ants, E.O. Wilson: "In a purely technical sense, each species of higher organism—beetle, moss, and so forth, is richer in information than a Caravaggio painting, Mozart symphony, or any other great work of art."


Thanks to original thinker, Nassim Nicholas Taleb, the term “black swan” is now well known in the world of finance. A trader by profession, he originated the phrase to describe unexpected events and the role of chance that more often than not are responsible for the behavior of the markets despite the best-intentioned, most persuasive predictions, analytics, and trading systems. In his collection of aphorisms, "The Bed of Procrustes", Taleb further states: "In science you need to understand the world; in business you need others to misunderstand it."

According to Jared Diamond, Harvard would have avoided the crash of its endowment and income during the 2008-9 worldwide financial meltdown if its “financial managers…followed the risk management strategy of peasant farmers, who maximize long-term time-averaged yields only insofar as that is compatible with maintaining yields above a certain critical level.” Tell that to former Fed Chairman Alan Greenspan.
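Diamond's peasant-farmer rule can be stated almost algorithmically: among the plans whose worst year still clears a survival floor, pick the one with the best long-term average. A toy sketch with hypothetical yield numbers (the field names and figures are my own illustration, not Diamond's data):

```python
# The peasant-farmer strategy as a selection rule: first filter out any
# plan whose worst year falls below the critical level, then maximize
# the time-averaged yield among the survivors. Numbers are invented.

def choose_strategy(strategies: dict[str, list[float]], floor: float) -> str:
    """Highest-average strategy among those that never dip below `floor`."""
    safe = {name: yields for name, yields in strategies.items()
            if min(yields) >= floor}
    return max(safe, key=lambda name: sum(safe[name]) / len(safe[name]))

yields = {
    "one_big_field":    [900, 50, 1200, 40],   # higher average, ruinous bad years
    "scattered_fields": [400, 350, 450, 380],  # lower average, never below floor
}
print(choose_strategy(yields, floor=200))  # scattered_fields
```

Note that the "one big field" plan wins on raw average; it loses because two of its years fall below the survival threshold, which is exactly the trade-off Diamond says the farmers, and not the endowment managers, understood.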

In addition, Diamond points out that both traditional societies and Western cultures “have a tendency to resort to rituals in situations whose outcomes are hard to predict.” Whether it’s the superstitious behaviors of athletes, dowsers looking for water, or the mad graphs of systems traders, the unexpected seems to be the only thing we can expect from a world designed for change. The best workable, hybrid strategy seems to be staying open to the opportunities offered by chance and uncertainty while also exercising our unique ability, as thinking animals, to create scenarios.

“The fact is that we’re actually living permanently in the future and that’s what really worries me,” says Terry Gilliam, whose forthcoming movie, “The Zero Theorem,” is a dystopia about a coder drowning in data. It seems to be an emergent theme, as a recent spate of movies may be trying to tell us. Whether or not you think that traditional societal knowledge or ancient wisdom is relevant, there’s a wave of films about survival—among them “Gravity,” “12 Years a Slave,” “American Hustle,” and “The Hobbit.” “Gravity” director Alfonso Cuaron pointed out this theme on a recent episode of the “Charlie Rose Show.” Despite the advantages of modern life, perhaps survival is more imperative than ever if it is being expressed in so many mass dreams.

The idea of connecting ancient traditions and mysticism with the business world may strike some as bad mojo, or as controversial. Others, like Jaron Lanier, who coined the phrase “virtual reality,” have become Neo-Luddites and are warning us about the perils of accepting technology without knowing its potentially unfavorable consequences. We are not romanticizing tribal peoples, who have their own problems and have all but succumbed to modern ways; certain practices of extant traditional societies are brutal, especially with regard to women. But Terence McKenna had a point when he spoke of our times as an “Archaic Revival,” in which rediscovering the cultural riches of techniques that have worked well enough to survive thousands of years in our shared past has the potential to save us from repeating the future.

I'll be speaking about "The Business Shaman" and the Cave Mind Operating System at The British Museum on January 28th.

Saturday, June 15, 2013


I've noticed at recent YouTube, 2nd-, 3rd-, and 4th-screen panels, and other such confabs that established YouTube stars are often unseasoned in the kind of linear storytelling basics that have driven compelling content since radio and the serials. While YouTube "views" may be a good start, it's also been interesting to note that many of these Internet "storytellers" are targeting traditional media distribution as the eventual outlet for their content and careers. Another motivator is suggested by a surge of content-producer backlash against Google's revenue share in its nascent subscription channel partnerships. Now that the inmates are running the digital asylum and capitalizing on self-distribution platforms, many pundits and companies are betting that the web will produce talent for other, more traditional media.

But there is a reason why newcomers to programming like Netflix, Xbox, and Amazon TV have all hired former entertainment executives to develop their new series entries, and there is probably a reason that they are defaulting to known stars and producers to deliver them. At the same time, the creative agencies in Hollywood have legions of so-called "twenty-somethings" diligently scouring the web for emergent talent to scoop up for representation. The entertainment business is hard to forecast--despite Google's recent announcement that search results can predict weekend box office. As one studio executive remarked, "Duh..." Still, the average YouTube video is 2:46 long, so short-form is definitely king for the standard-issue web viewer, and fifteen-second rich media banners are another indicator.

How much narrative can fit in such a format, and does it drive sequential viewing like traditional long form? A generational shift has certainly taken place with access to multiple-screen entertainment, and hopefully it will yield new storytellers and formats. But web storytellers should look at securing a foundation in what has made stories relevant ever since the tribal fireside, rather than only producing the latest novelty video. We are hopefully moving beyond the cat videos that dominated early YouTube successes, and the opportunities of a multi-channel distribution universe offer these newcomers novel ways of stepping up to the plate and potentially reaching huge audiences. There are other challenges.

I recall a well-attended confab a number of years ago at the home of Buzztone Marketing head Liz Heller, where a web video producer exuberantly proclaimed that his video had received over fifty thousand "hits." I strenuously resisted the temptation to ask whether the scenario would have been different had he asked viewers to pay for the privilege.

Even so, the behavioral shift in attention span has yielded an almost cannibalistic user appetite for novelty, rather than for what network program executives used to call "appointment viewing." Google's addition of watch-time metrics speaks to this trend, because it essentially measures whether viewers are sticking with a video all the way through, as opposed to merely grazing and moving on to the next recommendation from the engine itself or from their friends. Among older viewers, binge viewing has also had an impact on the way traditional television series are watched. Time Warner invested over $30 million last year in Maker Studios, and perhaps it will pay off. It also raises the question: does taking this level of investment mean that Maker is not making any money? Predicting the future in Hollywood--exciting as these expansive platforms and distribution channels are--may be better left, for the time being, to Las Vegas bookies.

(With props to Bill Sarine from Beachglass Films for input and inspiration for this post)

Tuesday, July 24, 2012


The Web is old enough so that several generations have grown up with it at this point. For those of us who can recall that lonesome, crunchy ripple of dial-up, the wonder of Mosaic, the bustling of bulletin boards, and other artifacts of the early days of the World Wide Web, it may be hard to imagine the lives of young people who have never known what it is like to live without the Internet.

Its founding quietly took place back on October 29, 1969 with a message sent from UCLA to Stanford. According to Jonathan Zittrain, author of The Future of the Internet and How To Stop It, the purpose of building such a network at that time “was not to offer a particular set of information or services like news or weather to customers…but to connect anyone on the network to anyone else. It was up to the people connected to figure out why they wanted to be in touch in the first place; the network would simply carry data between the two points.” Without a large war chest, the academics and corporate researchers behind its original design couldn’t contemplate the creation of a global network. Now that this has occurred, it’s fascinating to see that its pervasive role in connecting people to brands and services is not that different from how it worked in its infancy.

The early days of its popular use ramped up in 1995; by the end of that year, it had grown to a robust 16 million users. The latest data show that there are over two and a quarter billion global users today—about a third of the world’s total population.

The impact of the mobile web, social media, and social TV—networks within a network—has expanded the Internet’s eco-system beyond the affinities, communities, and other nodes of specialized interest that grew out of its original capabilities. The potential to knit together networks for brands and marketers presents a compelling new non-linear business model.

It is so much a part of our lives that we sometimes forget that the Internet is a network of networks. Its use now extends across social networks, the mobile web, landing pages, and RSS broadcast (among others), and each new extension should remind us that the power of the Web lies not just in the common URL, but in the way that it radiates from site destination points to expansively "cast a net widely".

Network Conceptualization of Twitter

Directing eyeballs to a single web site is an expensive and time-consuming venture. While there are certain goliath brands like Amazon and eBay that have established themselves as islands in the data stream, we are moving back to the future in a sense, from the destination orientation of the web to a radial network model. What exactly does this mean for marketing?

When a client comes in and characteristically asks for “a new web site” that is blue and has flowers, what they may really be asking for is a functional hub inside a multi-channel, branded ecosystem. This hub sits at the center of a hybrid digital and traditional media wheel with multiple spokes (or channels) that expand on a conversational journey outward, around, and looping back. If effective, these pathways have the eventual effect of creating a dynamic platform by linking to various targeted nodes of influence and driving users, consumers—an audience—back to the mother lode at the web site home. With consumers favoring multiple media channels, it is important that brands reach out to them wherever they are and on whatever medium they favor. We are no longer living in a world where the all-too-familiar address of www. has a lemming effect.
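For the programmatically inclined, the hub-and-spoke model can even be sketched in a few lines of code. This is purely illustrative—the channel names and referral counts below are made up for the example—but it shows the shape of the thing: spokes radiate out from a central brand hub, and each loops traffic back to it.

```python
# A toy sketch of the hub-and-spoke brand network described above.
# Purely illustrative: channel names and referral counts are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Channel:
    """A spoke: one marketing channel orbiting the brand hub."""
    name: str
    referrals_to_hub: int  # users this channel drives back to the hub


@dataclass
class BrandHub:
    """The central web-site hub of the multi-channel ecosystem."""
    name: str
    spokes: list = field(default_factory=list)

    def add_spoke(self, channel: Channel) -> None:
        self.spokes.append(channel)

    def total_inbound(self) -> int:
        # Traffic that the spokes loop back to the mother lode.
        return sum(c.referrals_to_hub for c in self.spokes)


hub = BrandHub("brand-site.example")
for name, refs in [("social media", 1200), ("mobile web", 800), ("video", 450)]:
    hub.add_spoke(Channel(name, refs))

print(hub.total_inbound())  # prints 2450
```

The point of the toy model is the topology, not the numbers: add or remove a spoke and the hub's inbound traffic changes, but the center of gravity stays put.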

Social media black belt and integrated-marketing-strategy Technoratisaurus Rex Liz Gebhardt formulated a depiction of the multi-channel universe in a blog post written over three years ago about product modeling, which I have affectionately called "The Gebhardt Brand Mandala" ever since:

At the core of any network is the hub that establishes its identity and center of gravity. Gebhardt’s model is more relevant than ever—especially as video platforms converge with traditional broadcast and short-form content. Even so, mainstream television and cable broadcast networks have for years now had to rely on watermarks that float on the lower third of the screen in a feeble attempt to claw back some remaining identification. This is not easy when TV brands are whipped up in the 500+ channel universe blender and feature ever-more-redundant programming. With the addition of Internet and mobile television to the mix, as well as behavioral trends like social TV and the rise of “binge viewing” (see my next blog post), the whole concept of the video “broadcast network” is being reinvented and is up for grabs.

If YouTube is any indication, today’s average web user is videocentric, searching for constant novelty with a cannibalistic fervor, and beset by a short attention span—two minutes and forty-three seconds to be exact. It’s also a short-form video world on the Internet, with some form of video hogging 55% of global bandwidth at any given time—30% of it BitTorrent, with YouTube playing catch-up at 25%. What does this volume mean for brands, services, or products in search of an audience?

Primarily, it means that if the common language of the world is now visual, then you would be wise to “speak” video to get your message across. Back in the day, Yahoo had what they called the “Three-Second Rule”, which admonished internal developers to deliver on value within that time-frame or risk losing the user forever. This was a far more dire prospect than the days when the TV remote control ruled and broadcasts had to engage viewers or else—another network was only a click away. Today, we are overwhelmed with the choices that clicks can offer us, and the order of the day is to grab attention through immediacy, interactivity, personalization, community, freemium, and other methods of creating value for the end user. All of these elements should inform how a brand platform or network is created.

The brand network model should be customized, but its essential look is planetary with marketing channels orbiting like satellites around the brand hub:

In taking a non-linear approach, we are only mimicking nature, where radial models dominate and networks abound. Chinese Taoist philosophers called the rivers and waterways of earth its circulatory system and the planetary water cycle bears this analogy out as science. Theodor Schwenk’s Sensitive Chaos is a classic study of patterns in water flow that can be seen as inspiring the classic phrase that “water seeks its own level”—an adage that is also an apt description of social network behavior.

The idea of recurring patterns in nature, their evolution in time, and their configuring and reconfiguring to enhance movement is the basis of the constructal law. One of the designs in nature that it describes is branching. Lightning, water, cardiovascular systems—all have evolved treelike architecture because it is an efficient way of moving currents. The constructal law also extends to organization in business, politics, science, and other fields where hierarchy and flow create patterns.
The brain is famously a neural network, and the web of life can be seen as a vast set of self-organizing nodes. Why should its mirror in the Internet be any different? The creation of a brand platform network is an organic expression of its capabilities.

The history of successful networks and cable channels is that they have been branded by a defining show or personality that encapsulates a broadcast entity’s mission and identity. Early television networks ported over successful radio personalities as a way of not only bringing along their loyal audiences, but also of defining their program offerings—whether comedy, drama, soaps, game shows, or other familiar formats. Though it started in 1948, the ABC network really established itself in the 1970s as differentiated from its predecessors CBS and NBC, with the introduction of innovations like “Monday Night Football” and the “Movie of the Week”.

Later, in 1986, when Fox arrived, it distinguished itself as the “18-34” demographic network through younger-skewing, slightly more risqué shows like “The Simpsons” and “Married With Children” that took familiar formats to new extremes. MTV was not on the map until it broadcast “Remote Control”, its own version of a game show that turned a storied format on its head and signaled to both audience and advertisers that this was not your parents’ network. Emeril Lagasse’s larger-than-life chef character, in turn, helped establish the Food Network’s brand.

The idea of personalities defining networks is as old as Ed Sullivan, Walter Cronkite, and Dan Rather branding CBS, Johnny Carson and Bill Cosby branding NBC, and Roseanne and Barbara Walters branding ABC. Audiences like to identify with personalities and characters. A nascent network program development strategy should be informed by an active search for talent and defining show concepts that can attract viewers, differentiate its value proposition, and compel advertisers to invest.

The emergence of watermarks on broadcast and cable television during the ’90s was no accident. In a cluttered landscape of multiple brands with often-redundant program offerings, watermarks became an essential feature to help audiences know which entity they were actually watching. Whether a new network undertakes long- or short-form programming, it needs to be packaged in a way that is clearly branded. This is particularly the case for any web-related or mobile video. Current effective tactics for branding syndicated and seeded web video content include end-plate, white-wrap, and vanity-URL techniques to ensure tracking of traffic directly related to video campaigns.

The diversity of media choices to connect brands with consumers has never been greater. The challenge presented by this opportunity is how to integrate the real with the digital world, and to make a strategic assessment of which channels represent the optimal means of reaching audiences wherever they are. The media and marketing networks of the future will be integrated. Brands will all be broadcasters.

While it is still early—especially with networks and agencies learning the hard way about paid blogging and in-your-face tweeting—organic growth and behavioral change will fuel the eventual integration of platforms. Movies did not replace radio, and neither did TV. The Internet did not kill television, as many web evangelists predicted in its late-’90s go-go years. Social media will not replace prime-time programming. On the contrary, several IPTV startups are looking at permanent integration of Twitter. Putting chips down on all of these new marketing channel markers is strategic, but allocation of resources and investment needs to be measured, given that ROI, SROI, and analytic models are still evolving. Traffic is still a key indicator, but is no longer enough of a metric; measuring influence is another step in the right direction.

Every new medium defines its own market at the same time that it forces extant media to redefine their own market share. Survival depends on companies and creative talent being able to recognize and optimize the unique value that differentiates one from another, and to provide the appropriate content accordingly. The marketing networks of the future will identify the best means to reach their intended consumers to create value and an optimal, personalized experience with their respective brands and content. They will also ensure that consumers are empowered with information and enabled through the multiple channels they provide to be active participants in evolving brands to be better. 

In the future, brands should be more persuaded by the words of poets, than by marketers. Or in the words of Emily Dickinson: 
                                                    Tell All The Truth
                                            Tell all the truth but tell it slant,
                                            Success in circuit lies,
                                            Too bright for our infirm delight
                                            The truth's superb surprise;

                                            As lightning to the children eased
                                            With explanation kind,
                                            The truth must dazzle gradually
                                            Or every man be blind.

We are all nodes on the network of all networks now. Or to let Shakespeare have the final word, "'Tis true, there is magic in the web of it." 

Saturday, March 26, 2011


A more recent take on Andy Warhol’s famous dictum puts us in a future where we will all have 15 friends. If you Google the word “social”, you get over 2 billion results. But what is this “social” which we all take for granted and of which we all so readily speak? The word appears in English prior to the year 1387 as sociale, apparently borrowed from the Latin via Middle French. Rooted in the Roman mother tongue, it originally meant “united or living with others” and “companion.”

Looking just one step further into the wilderness of word origins, we find its root in the Latin sequi which means “to follow.” So, here in a nutshell is where the Twitter transitive verb, “to follow”, finds its first use. If we search still further, we come upon its link to the Old Icelandic seggr meaning “companion or man” and ultimately, to the mother lode in Sanskrit where, as sakha, it simply means “friend.”

Here, we arrive at the root origin of the Facebook transitive verb, “to friend”, closing the loop on a word that we use every day to describe the expanding communication ripples that bind, link, and otherwise connect us at a click. Or to paraphrase what Terence, the Roman playwright, might tweet: “Nothing social is foreign to me.”

Friday, March 25, 2011


I kind of like to know what I’m talking about. At least, I like to know what the words I’m using mean even if I can’t make sense with what I am trying to make them say. We invariably use lots of words throughout our daily lives without reference to where they come from or how their original meaning has changed. Sometimes, we’re even distant from the slang that seems so current, but may be recycled. Words like “cool” have re-entered the lingo of new generations who don’t know that it came from the bebop beatniks, daddy-o. The first time I heard it since I’d first heard Kookie Byrnes say it on “77 Sunset Strip” was in Silicon Valley in the ’90s—and it came out of the mouths of some very geeky engineers. I still get a funny feeling when I hear Bill Gates use it in one of his testimonials.

There are other words in common usage that are very distant from their origins—one in particular is almost as widespread as the words “like” and “awesome”. That word is “suck”, and it’s been somewhat twisted, not necessarily to mean something entirely new, but it has found wide social acceptance despite its low origins.

When I was about 11, I bought some badges at the local hippie emporium. One of them said, “Dracula Sucks”, which my father made me take right off my Sgt. Pepper’s jacket and toss in the trash. I was surprised, and he answered what must have been my hangdog look by saying that it was “just inappropriate.” That was enough for me to spend the rest of the night seeking out its deep, dark, hidden meaning. I better understood when I discovered it referred to a sex act that my teachers probably would not see eye-to-eye with as a point for extra class discussion.

But today, “suck” is so commonly used in commercials, on talk shows, by politicians, and in everyday conversation across generations that it seems to have been denuded of its original meaning. It’s used to convey a general sense of something that is awful. Its reference to a subservient position for one participant in a sex act may be hidden in the mists of time—or at least in how well-worn it’s become as part of the daily lexicon used by schoolchildren and adults.

I wondered if its popular use might be excused somehow—maybe there was another meaning that forgave its vulgar origins. After some digging into a handy dictionary of etymology, I discovered that the word was part of a once-popular phrase. It just so happened that “suck” also designated the sad lot of the runt of the porcine litter who was left to suckle on the hindmost “teat”. Eventually, the expression gave way to “suck hind tit”, which is probably shrink-wrapped today in common usage as good old, plain, “it sucks!” So, the next time you are tempted to use it, remember that words are chameleons that double up and sometimes come back to bite us like the time-tossed travelers they are.

(With thanks to Suw Charman Anderson and Peter Corbett.)

Saturday, October 23, 2010


“Who is Don Draper?” is one of the central themes of the hit AMC series “Mad Men”. A friend asked my opinion of this season’s finale and how I thought the series would ultimately end. It got me thinking about a lot more than Don’s dilemma and urge to confess. My confession is that I have a love/hate relationship with the show.

I grew up in the era that it depicts so well—what I hate is the mirror reflection of what I remember of that time as a child of divorce. But what I love is the great dramatic craft and wonderful acting—even though I did pick up one anachronism last year. In the offending episode, the ad guys celebrated on one occasion by breaking out a bottle of Stolichnaya—a gesture that certainly would not have gone over too well in the Cold War Era. I’ve subsequently seen bottles of Smirnoff in later episodes, which has righted the situation. That said, there is so much to admire about the art direction and attention to period detail that it almost makes you want to take off your fedora, take out a pack of Lucky Strikes, and reinstate the three-martini lunch ritual.

But getting back to my friend’s questions, I told her that I thought that the last scene of the finale was unnecessary. When I saw the masterful scene before it, I thought that it was a great open ending—after Don tells his ex-wife that he’s getting remarried, we see an empty bottle on the kitchen counter of their former family home, framed center stage like a dark Courbet still life as the lights are shut off by the departing former couple in what I thought was a fade to black and end credits. The question becomes—is there a spark between them that will come between Don and his impending new marriage? We already should have reservations about the match, knowing that he is an inveterate cad and that his impulsive decision to marry his secretary does not augur well for faithfulness or longevity.

But the writers couldn’t resist the obvious, and my enthusiasm was quickly dampened because it didn’t end there. They chose to continue, and cut to Don and his fiancée in bed with Sonny and Cher’s “I Got You Babe” playing in the background. Now, the implication here—and a rather heavy-handed one, I thought—whether it is subconscious or not—is that the duo singing on the soundtrack had a fairy-tale Hollywood marriage that ended in divorce—hence, the seemingly sentimental hippie love song casts a foreboding shadow over the betrothed couple lounging in bed. This final scene seemed to lack the subtlety of the one before it—do we really have to spell everything out in TV America? The kitchen scene struck me as something you’d see in a foreign film. The bed scene, typical soap opera.

OK. I’ll stop producing the show. Let’s get back to Don. He’s got a secret that is now burning after the last season. The perfection of his character is written down to his Dickensian name—his “adopted” namesake of “Draper” lends itself to two meanings—his hidden identity is sequestered under the draping of what happened in the Korean War; the other, is that, in addition to hiding personal truths, he is the perfect Ad Man because he’s so adept at all the shadings of truth that his profession requires.

The well-spun phrase “Truth in Advertising” (actually a law) certainly has a different spin to it especially with the platform now offered by social media. Consumers can instantly flex their muscles and spark negative PR grassfires that can grow into the kind of outright conflagrations that have sometimes brought corporate giants to their knees—remember the Domino’s pizza disaster when several misbegotten pizza twirlers posted a video on YouTube showing them adorning their pies with toppings that were…shall we say, not on the menu, but definitely organic? There are countless other examples that have motivated most major corporations to preemptively hire legions of twenty-year-olds to maintain a vigilant watch in the blogosphere for negative consumer rumblings. Mad consumers can now be an activist virus.

Back in the 1960’s however, we were living the American Dream and drinking the Kool Aid that turned us into that consummate culture of consumption which has for the last several decades displaced our standing as a manufacturing country—the rest as they say, is subject of daily reports on the unemployment rate, foreclosures, and the general Fall of the Dow Jones and perhaps Empire. One of the many things that “Mad Men” gets so right is the way that we were sold and bought a bill of goods in the 50’s and 60’s from unfiltered cigarettes to the bomb shelters that we didn’t need. It all seemed so simple—merely “duck and cover” until the inconvenience of an atomic attack passed over and we could return to our regularly scheduled programming.

So, Don Draper is really the Beatles’ “Nowhere Man” come several years early, and in some respects he is a reflection of several generations who lived through the post-World War II era. He’s caught between the sheets as a relationship train wreck who doesn’t know who he is, and caught between the ’50s and the ’60s that are starting to explode, as we see in the season just concluded.

Don is a hybrid archetype of both T.S. Eliot’s “hollow men” and the discontented businessman captured by Sloan Wilson’s “The Man In The Grey Flannel Suit”, a ’50s best seller and a hit movie with Gregory Peck and Jennifer Jones in 1956. Like this work, “Mad Men” has appropriately been celebrated as capturing the mid-century American “zeitgeist”, just as “The Social Network” has been cited as doing the same for the Internet Era. The irony about Don’s secret is that he is about to enter a time period when his identity problem is no longer relevant.

Even though I enjoy the series, I hope that next season is the last. In America, the lifespan of a TV series is not motivated so much by its organic narrative shape and pulse, but by the economic imperative of reaching the magical goal of a syndicatable 65 episodes. This is the threshold for the number of episodes that can be “stripped” or be distributed as “repeats” five days a week for roughly half-a-year (26 weeks) before they have to recycle and repeat. One of the reasons that foreign, and in particular, British TV series seem to have an edge to them—take the seamlessness and naturally closed narrative arc of a classic series like “The Prisoner”, for example—is that rather than produce a show until it runs out of steam, foreign shows are written and produced as so-called “limited series”, a standard emulated well by some US cable network shows.

The end of “The Sopranos” was roundly criticized as a cop-out by many critics and fans, and is a prime example of how the American system is wanting at times. Sitcoms may be one thing to draw out as long as the stars stay the relative ages of their characters, but drama is better written as an entire story arc at the outset rather than running on tracks that have no final destination in sight—except having enough episodes to syndicate.

So, how should “Mad Men” end? Here’s my take: An energized client pitch is disrupted at the agency offices as a growing brouhaha emanates from the New York streets below. It's the sound of thousands of Peace Marchers parading to protest the Vietnam War and starting to fight with hooting construction workers. Maybe according to the series’ lifecycle, it’s a tad early for this and I’m committing my own anachronistic crime, but time lapse could help the series get through the inevitable relationship body counts which predictably lie ahead.

For all we know, Don may have already dropped acid in a future episode, thus confusing his identity issue even further like so many who psychedelicized. Now, with a burgeoning Peace Movement and Hippie Scene converging on our Nowhere Man, he is overcome by curiosity as everyone in the pitch meeting is drawn in astonishment to the high window to look out over the spectacle of history in the making. Impulsive to the end, he bails on the pitch and descends to the street. On the ground, he is caught up in the crowd, looking unsure of himself as his tie is loosened and jacket pawed by hippie chicks who welcome the “straight”. We last see him as he looks around in disbelief, not knowing whether to join “the parade” or run for his life. What he realizes quickly is that his desire to confess and his problems, in the immortal words spoken by Humphrey Bogart at the end of “Casablanca”, don’t “amount to a hill of beans” compared to the march of history. And the audience doesn't know either. End of story.

That way, Don Draper represents a whole generation of Mad Men, who like my father, were all so convinced that they were defined by their work. Blame the Cold War or Madison Avenue. Don is only special because he had to deploy a mask to cover up an identity issue that is no longer a big deal when assassinations, LSD, and Vietnam ripped open the facade of the mythic 50's/60's “Ozzie and Harriet”/I Like Ike/Apple Pie/Take A Letter/Zone, and everyone was revealed as not knowing who they were, where they belonged, and what tribe was right for them...Ultimately, Don can only find the redemption we all hope for him once the women finally take over, so maybe he gets hit on the head at the end with all the secretaries’ burning bras as they fall from high out of the agency office windows like a snowy ticker tape parade over Madison Avenue.

And now, for these messages from our discorporate sponsors…

(With thanks to Sarah Kelley)