Showing posts with label YouTube. Show all posts

Saturday, June 15, 2013

IT'S STILL ALL ABOUT STORYTELLING


I've noticed at recent YouTube, second-, third-, and fourth-screen panels and other such confabs that established YouTube stars are often unseasoned in the kind of linear storytelling basics that have driven compelling content since the days of radio and serials. While YouTube "views" may be a good start, it's also been interesting to note that many of these Internet "storytellers" are targeting traditional media distribution as the eventual outlet for their content and careers. Another motivator is suggested by a surge of content-producer backlash against Google's revenue share in its nascent subscription-channel partnerships. Now that the inmates are running the digital asylum and capitalizing on self-distribution platforms, many pundits and companies are betting that the web will produce talent for other, more traditional media.

But there is a reason why newcomers to programming like Netflix, Xbox, and Amazon TV have all hired former entertainment executives to develop their new series entries. And there is probably a reason that they are defaulting to known stars and producers to deliver them. At the same time, the creative agencies in Hollywood have legions of so-called "twenty-somethings" diligently scouring the web for emergent talent to scoop up for representation. The entertainment business is hard to forecast--despite Google's recent announcement that search results can predict weekend box office. As one studio executive remarked, "Duh..." Still, the average YouTube video is 2:46 long, so short-form is definitely king for the standard-issue web viewer, and fifteen-second rich media banners are another indicator.

How much narrative can fit in such a format, and does it drive sequential viewing like traditional long form? A generational shift has certainly taken place with access to multiple-screen entertainment, and hopefully it will yield new storytellers and formats. But web storytellers should look at securing a foundation in what has made stories relevant ever since the tribal fireside, rather than engaging only in producing the latest novelty video. We're hopefully moving beyond the cat videos that dominated early YouTube successes, and the opportunities in a multi-channel distribution universe offer these newcomers novel ways of stepping up to the plate and potentially reaching huge audiences. There are other challenges.

I recall a well-attended confab a number of years ago at the home of Buzztone Marketing head Liz Heller, where a web video producer exuberantly proclaimed that his video had received over fifty thousand "hits". I strenuously resisted the temptation to ask whether the scenario would have changed had he asked viewers to pay for the privilege.

Even so, the behavioral shift in attention span has yielded an almost cannibalistic appetite among users for novelty, rather than for what network program executives used to call "appointment viewing". Google's addition of watch-time metrics speaks to this trend because it essentially measures whether viewers are sticking with a video all the way through, as opposed to merely grazing and moving on to the next recommendation from the engine itself or from their friends. Among older viewers, binge viewing has also had an impact on the way that traditional television series are watched. Time Warner invested over $30 million last year in Maker Studios and perhaps it will pay off. It also raises the question: does taking this level of investment mean that Maker is not making any money? Predicting the future in Hollywood--exciting as these expansive platforms and distribution channels are--may be better left for the time being to Las Vegas bookies.

(With props to Bill Sarine from Beachglass Films for input and inspiration for this post)

Tuesday, July 24, 2012

THE NETWORK OF ALL NETWORKS


 
The Web is old enough that several generations have grown up with it at this point. For those of us who can recall that lonesome, crunchy ripple of dial-up, the wonder of Mosaic, the bustling of bulletin boards, and other artifacts of the early days of the World Wide Web, it may be hard to imagine the lives of young people who have never known what it is like to live without the Internet.

Its founding quietly took place back on October 29, 1969 with a message sent from UCLA to Stanford. According to Jonathan Zittrain, author of The Future of the Internet and How To Stop It, the purpose of building such a network at that time “was not to offer a particular set of information or services like news or weather to customers…but to connect anyone on the network to anyone else. It was up to the people connected to figure out why they wanted to be in touch in the first place; the network would simply carry data between the two points.” Without a large war chest, the academics and corporate researchers behind its original design couldn’t contemplate the creation of a global network. Now that this has occurred, it’s fascinating to see that its pervasive role in connecting people to brands and services is not that different from how it worked in its infancy.

The early days of its popular use ramped up in 1995; by the end of that year, it had grown to a robust 16 million users. The latest data shows that there are over two and a quarter billion global users today—or about a third of the world’s total population.

The mobile web, social media, and social TV—networks within a network—have expanded its ecosystem beyond the affinities, communities, and other nodes of specialized interest that grew out of the original capabilities of the Internet. The potential to knit together networks for brands and marketers presents a compelling new non-linear business model.

It is so much a part of our lives that we sometimes forget that the Internet is a network of networks. Its use now extends across social networks, the mobile web, landing pages, and RSS broadcast (among others), and each new extension should remind us that the power of the Web lies not just in the common URL, but in the way that it radiates from site destination points to expansively "cast a net widely".




Network Conceptualization of Twitter - http://www.cytoscape.org

Directing eyeballs to a single web site is an expensive and consuming venture. While there are certain goliath brands like Amazon and eBay that have established themselves as islands in the data stream, we are moving back to the future in a sense, from the destination orientation of the web to a radial network model. What exactly does this mean for marketing?

When a client comes in and characteristically asks for “a new web site” that is blue and has flowers, what they may really be asking for is a functional hub inside a multi-channel, branded ecosystem. This hub sits at the center of a hybrid digital and traditional media wheel with multiple spokes (or channels) that expand on a conversational journey outward, around, and looping back. If effective, these pathways have the eventual effect of creating a dynamic platform by linking to various targeted nodes of influence and driving users, consumers—an audience—back to the mother lode at the web site home. With consumers favoring multiple media channels, it is important that brands reach out to them wherever they are and on whatever medium they favor. We are no longer living in a world where the all-too-familiar address of www. has a lemming effect.

Social media black belt and integrated marketing strategy Technoratisaurus Rex, Liz Gebhardt, formulated a depiction of the multi-channel universe in a blog post about product modeling written over three years ago, which I have affectionately called "The Gebhardt Brand Mandala" ever since:


At the core of any network is the hub that establishes its identity and center of gravity. Gebhardt’s model is more relevant than ever—especially as video platforms converge with traditional broadcast and short-form content. Even so, mainstream television and cable broadcast networks have for years now had to rely on watermarks floating in the lower third of the screen in a feeble attempt to claw back some remaining identification. This is not easy when TV brands are whipped up in the 500-plus-channel universe blender and feature ever more redundant programming. With the addition of Internet and mobile television to the mix, as well as behavioral trends like social TV and the rise of “binge viewing” (see my next blog post), the whole concept of the video “broadcast network” is being reinvented and is up for grabs.

If YouTube is any indication, today’s average web user is videocentric, searching for constant novelty with a cannibalistic fervor, and beset by a short attention span—two minutes and forty-three seconds, to be exact. It’s also a short-form video world after all on the Internet, with 55% of global bandwidth hogged at any given time by some form of video—30% of it BitTorrent, with YouTube playing catch-up at 25%. What does this volume mean for brands, services, or products in search of an audience?

Primarily, it means that if the common language of the world is now visual, then you would be wise to “speak” video to get your message across. Back in the day, Yahoo had what they called the “Three-Second Rule”, which admonished internal developers to deliver on value within that time-frame or risk losing the user forever. This was a far more dire prospect than the days when the TV remote control ruled and broadcasts had to engage viewers or else—another network was only a click away. Today, we are overwhelmed with the choices that clicks can offer us, and the order of the day is to grab attention through immediacy, interactivity, personalization, community, freemium, and other methods of creating value for the end user. All of these elements should inform how a brand platform or network is created.

The brand network model should be customized, but its essential look is planetary with marketing channels orbiting like satellites around the brand hub:



In taking a non-linear approach, we are only mimicking nature, where radial models dominate and networks abound. Chinese Taoist philosophers called the rivers and waterways of earth its circulatory system and the planetary water cycle bears this analogy out as science. Theodor Schwenk’s Sensitive Chaos is a classic study of patterns in water flow that can be seen as inspiring the classic phrase that “water seeks its own level”—an adage that is also an apt description of social network behavior.

The idea of recurring patterns in nature, their evolution in time, and their configuring and reconfiguring to enhance movement is the basis of what is known as the constructal law. One of the designs in nature that it studies is branching. Lightning, water, cardiovascular systems—all have evolved treelike architecture because it is an efficient way of moving currents. Constructal theory also extends to organization in business, politics, science, and other fields where hierarchy and flow create patterns. 
The brain is famously a neural network and the web of life can be seen as a vast set of nodes that are self-organizing. Why should its mirror in the Internet be any different? And the creation of a brand platform network is an organic expression of its capabilities.

The history of successful networks and cable channels is that they have been branded by a defining show or personality that encapsulates a broadcast entity’s mission and identity. Early television networks ported over successful radio personalities as a way not only of bringing along their loyal audiences, but of defining their program offerings—whether comedy, drama, soaps, game shows, or other familiar formats. Though it started in 1948, the ABC network really established itself in the 1970s as differentiated from its predecessors, CBS and NBC, with the introduction of innovations like “Monday Night Football” and the Sunday Night “Movie of the Week”.

Later, in 1986, when Fox arrived, it distinguished itself as the “18-34” demographic network through younger-skewing, slightly more risqué shows like “The Simpsons” and “Married With Children” that took familiar formats to new extremes. MTV was not on the map until it broadcast “Remote Control”, its own version of a game show that turned a storied format on its head and signaled to both audience and advertisers that this was not your parents’ network. Successful branding by Emeril Lagasse established the Food Network with a larger-than-life chef character.

The idea of personalities defining networks is as old as Ed Sullivan, Walter Cronkite, and Dan Rather branding CBS, Johnny Carson and Bill Cosby branding NBC, and Roseanne and Barbara Walters branding ABC. Audiences like to identify with personalities and characters. A nascent network program development strategy should be informed by an active search for talent and defining show concepts that can attract viewers, differentiate its value proposition, and compel advertisers to invest.

The emergence of watermarks on broadcast and cable television during the 90’s was no accident. In a cluttered landscape of multiple brands with often-redundant program offerings, it became an essential feature to help audiences know what entity they were actually watching. Whether a new network undertakes long or short-form programming, it needs to be packaged in a way that is clearly branded. This is particularly the case for any web-related or mobile video. Current effective tactics for syndicated and seeded web video content branding include end plate, white-wrap, and vanity URL techniques to ensure tracking of traffic directly related to video campaigns.

The diversity of media choices to connect brands with consumers has never been greater. The challenge presented by this opportunity is how to integrate the real with the digital world, and to make a strategic assessment of which channels represent the optimal means of reaching audiences wherever they are. The media and marketing networks of the future will be integrated. Brands will all be broadcasters.

While it is still early—especially with networks and agencies learning the hard way about paid blogging and in-your-face tweeting—organic growth and behavioral change will fuel the eventual integration of platforms. Movies did not replace radio, and neither did TV. The Internet did not kill television, as many web evangelists in its late-90's go-go years predicted. Social media will not replace prime-time programming. On the contrary, several IPTV startups are looking at permanent integration of Twitter. Putting chips down on all of these new marketing channel markers is strategic, but allocation of resources and investment needs to be measured, given that ROI, SROI, and analytic models are still evolving. Traffic is still a key indicator but is no longer enough of a metric; measuring influence is another step in the right direction.

Every new medium defines its own market at the same time that it forces extant media to redefine their own market share. Survival depends on companies and creative talent being able to recognize and optimize the unique value that differentiates one from another, and to provide the appropriate content accordingly. The marketing networks of the future will identify the best means to reach their intended consumers to create value and an optimal, personalized experience with their respective brands and content. They will also ensure that consumers are empowered with information and enabled through the multiple channels they provide to be active participants in evolving brands to be better. 

In the future, brands should be more persuaded by the words of poets, than by marketers. Or in the words of Emily Dickinson: 
                                                    
                                                    Tell All The Truth
                                            
                                            Tell all the truth but tell it slant,
                                            Success in circuit lies,
                                            Too bright for our infirm delight
                                            The truth's superb surprise;

                                            As lightning to the children eased
                                            With explanation kind,
                                            The truth must dazzle gradually
                                            Or every man be blind.

We are all nodes on the network of all networks now. Or to let Shakespeare have the final word, "'Tis true, there is magic in the web of it." 

Saturday, October 23, 2010

MAD MEN’s DON DRAPER: NOWHERE MAN IN SEARCH OF TRUTH IN ADVERTISING


“Who is Don Draper?” is one of the central themes of the hit AMC series. A friend asked my opinion of this season’s finale and how I thought the series would ultimately end. It got me thinking about a lot more than Don’s dilemma and urge to confess. My confession is that I have a love/hate relationship with the show.

I grew up in the era that it depicts so well—what I hate is the mirror reflection of what I remember of that time as a child of divorce. But what I love is the great dramatic craft and wonderful acting—even though I did pick up one anachronism last year. In the offending episode, the ad guys celebrated on one occasion by breaking out a bottle of Stolichnaya—a gesture that certainly would not have gone over too well in the Cold War Era. I’ve subsequently seen bottles of Smirnoff in later episodes, which has righted the situation. That said, there is so much to admire about the art direction and attention to period detail that it almost makes you want to take off your fedora, take out a pack of Lucky Strikes, and reinstate the three-martini lunch ritual.

But getting back to my friend’s questions, I told her that I thought that the last scene of the finale was unnecessary. When I saw the masterful scene before it, I thought that it was a great open ending—after Don tells his ex-wife that he’s getting remarried, we see an empty bottle on the kitchen counter of their former family home, framed center stage like a dark Courbet still life, as the lights are shut off by the departing former couple in what I thought was a fade to black and end credits. The question becomes—is there a spark between them that will come between Don and his impending new marriage? We should already have reservations about the match, knowing that he is an inveterate cad; his impulsive decision to marry his secretary does not augur well for faithfulness or longevity.

But the writers couldn’t resist the obvious, and my enthusiasm was quickly dampened because it didn’t end there. They chose to continue and cut to Don and his fiancée in bed with Sonny and Cher’s “I Got You Babe” playing in the background. Now, the inference here—and a rather heavy-handed one, I thought, whether it is subconscious or not—is that the duo singing on the soundtrack had a fairy-tale Hollywood marriage that ended in divorce—hence, the seemingly sentimental hippie love song casts a foreboding shadow over the betrothed couple lounging in bed. This final scene seemed to lack the subtlety of the one beforehand—do we really have to spell everything out in TV America? The kitchen scene struck me as something you’d see in a foreign film. The bed scene, typical soap opera.

OK. I’ll stop producing the show. Let’s get back to Don. He’s got a secret that is now burning after the last season. The perfection of his character is written down to his Dickensian name—his “adopted” namesake of “Draper” lends itself to two meanings—his hidden identity is sequestered under the draping of what happened in the Korean War; the other, is that, in addition to hiding personal truths, he is the perfect Ad Man because he’s so adept at all the shadings of truth that his profession requires.

The well-spun phrase “Truth in Advertising” (actually a law) certainly has a different spin to it especially with the platform now offered by social media. Consumers can instantly flex their muscles and spark negative PR grassfires that can grow into the kind of outright conflagrations that have sometimes brought corporate giants to their knees—remember the Domino’s pizza disaster when several misbegotten pizza twirlers posted a video on YouTube showing them adorning their pies with toppings that were…shall we say, not on the menu, but definitely organic? There are countless other examples that have motivated most major corporations to preemptively hire legions of twenty-year-olds to maintain a vigilant watch in the blogosphere for negative consumer rumblings. Mad consumers can now be an activist virus.

Back in the 1960’s however, we were living the American Dream and drinking the Kool Aid that turned us into that consummate culture of consumption which has for the last several decades displaced our standing as a manufacturing country—the rest as they say, is subject of daily reports on the unemployment rate, foreclosures, and the general Fall of the Dow Jones and perhaps Empire. One of the many things that “Mad Men” gets so right is the way that we were sold and bought a bill of goods in the 50’s and 60’s from unfiltered cigarettes to the bomb shelters that we didn’t need. It all seemed so simple—merely “duck and cover” until the inconvenience of an atomic attack passed over and we could return to our regularly scheduled programming.

So, Don Draper is really the Beatles’ “Nowhere Man” come several years early, and in some respects he is a reflection of several generations who lived through the post-World War II era. He’s caught between the sheets as a relationship train wreck who doesn’t know who he is, and caught between the 50’s and the 60’s that are starting to explode, as we see in the season just concluded.

Don is a hybrid archetype of both T.S. Eliot’s “hollow men” and the discontented businessman captured in Sloan Wilson’s “The Man In The Grey Flannel Suit”, a 50’s best seller and a hit movie with Gregory Peck and Jennifer Jones in 1956. Like this work, “Mad Men” has appropriately been celebrated as capturing the mid-century American “zeitgeist”, just as “The Social Network” has been cited as doing the same for the Internet Era. The irony about Don’s secret is that he is about to enter a time period when his identity problem is no longer relevant.

Even though I enjoy the series, I hope that next season is the last. In America, the lifespan of a TV series is not motivated so much by its organic narrative shape and pulse as by the economic imperative of reaching the magical goal of a syndicatable 65 episodes. This is the threshold for the number of episodes that can be “stripped”, or distributed as “repeats”, five days a week—one pass through the library lasts 13 weeks, so two passes fill roughly half a year (26 weeks) before the cycle has to start again. One of the reasons that foreign, and in particular British, TV series seem to have an edge to them—take the seamlessness and naturally closed narrative arc of a classic series like “The Prisoner”, for example—is that rather than produce a show until it runs out of steam, foreign shows are written and produced as so-called “limited series”, a standard emulated well by some US cable network shows.
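For the numerically inclined, the stripping arithmetic is simple enough to sketch in a few lines (a toy calculation only, assuming one airing per weekday; the 65-episode threshold is the figure cited above):

```python
# Toy sanity check of the syndication "stripping" arithmetic:
# a library stripped once per weekday airs 5 episodes a week.

def weeks_per_pass(episodes: int, airings_per_week: int = 5) -> float:
    """Weeks needed to air every episode once when stripped daily."""
    return episodes / airings_per_week

one_pass = weeks_per_pass(65)   # 13.0 weeks -- about one quarter
two_passes = 2 * one_pass       # 26.0 weeks -- roughly half a year
print(one_pass, two_passes)     # -> 13.0 26.0
```

In other words, a 65-episode library supports a quarter of daily repeats per pass, and two passes carry a station through about half the broadcast year.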

The end of “The Sopranos” was roundly criticized as a cop-out by many critics and fans, and is a prime example of how the American system is wanting at times. Sitcoms may be one thing to draw out as long as the stars stay close to the ages of their characters, but drama is better written as an entire story arc at the outset rather than running on tracks that have no final destination in sight—except having enough episodes to syndicate.

So, how should “Mad Men” end? Here’s my take: An energized client pitch is disrupted at the agency offices as a growing brouhaha emanates from the New York streets below. It's the sound of thousands of Peace Marchers parading to protest the Vietnam War and starting to fight with hooting construction workers. Maybe according to the series’ lifecycle, it’s a tad early for this and I’m committing my own anachronistic crime, but time lapse could help the series get through the inevitable relationship body counts which predictably lie ahead.

For all we know, Don may have already dropped acid in a future episode, thus confusing his identity issue even further like so many who psychedelicized. Now, with a burgeoning Peace Movement and Hippie Scene converging on our Nowhere Man, he is overcome by curiosity as everyone in the pitch meeting is drawn in astonishment to the high window to look out over the spectacle of history in the making. Impulsive to the end, he bails on the pitch and descends to the street. On the ground, he is caught up in the crowd, looking unsure of himself as his tie is loosened and jacket pawed by hippie chicks who welcome the “straight”. We last see him as he looks around in disbelief, not knowing whether to join “the parade” or run for his life. What he realizes quickly is that his desire to confess and his problems, in the immortal words spoken by Humphrey Bogart at the end of “Casablanca”, don’t “amount to a hill of beans” compared to the march of history. And the audience doesn't know either. End of story.

That way, Don Draper represents a whole generation of Mad Men who, like my father, were all so convinced that they were defined by their work. Blame the Cold War or Madison Avenue. Don is only special because he had to deploy a mask to cover up an identity issue that was no longer a big deal once assassinations, LSD, and Vietnam ripped open the facade of the mythic 50's/60's “Ozzie and Harriet”/I Like Ike/Apple Pie/Take-A-Letter zone, and everyone was revealed as not knowing who they were, where they belonged, and what tribe was right for them... Ultimately, Don can only find the redemption we all hope for him once the women finally take over, so maybe he gets hit on the head at the end with all the secretaries’ burning bras as they fall from high out of the agency office windows like a snowy ticker tape parade over Madison Avenue.

And now, for these messages from our discorporate sponsors…

(With thanks to Sarah Kelley)

Wednesday, May 27, 2009

HOW TO THINK INSIDE THE PYRAMID REDUX


For those readers who have been following the continuing, late-breaking story from Old Kingdom Ancient Egypt of nearly 5,000 years ago, I was recently graced and flattered by an email from the man who solved the great mystery of how the Great Pyramid was built:

Hi Kevin,

I'm Jean-Pierre Houdin and I'm very pleased with what you wrote about my theory...
HOW TO THINK INSIDE THE PYRAMID
Thank you...

I don't know if you had the opportunity to watch the NatGeo USA documentary about my work?

You should watch it:

http://www.youtube.com/view_play_list?p=3442C0E0D8EA2A33

Or you have the BBC2 Timewatch version:

http://www.youtube.com/view_play_list?p=0E083435887644B5&search_query=pyramid+houdin

A French documentary was also edited last year and was broadcast in many European countries.

The Japanese television NHK will broadcast their own documentary in Japan in the coming months.

I've received hundreds and hundreds of e-mails from all over the world, all very positives and most of them with these remarks I picked up from your blog:

"The jury may still be out in terms of how traditional Egyptologists have reacted to Houdin's theory, but to me, the idea makes logical, if not just plain common sense".
.../...

"The logic of Jean-Pierre’s theory is transparent and struck me as a breakthrough. It just made sense".

Egyptians were rationalists...The way they built the large smooth pyramids of the 4th Dynasty...makes sense (their "know-how")...
And they were as smart as we pretend we are...45 centuries ago...
The question : "How the pyramids were built" is our problem, not their...They built the pyramids...Period.
But since 200 years, all the guys willing to explain the construction started from a unique wrong idea: OUTSIDE...
Their answers are wrong from line one because the base of the studies is wrong...
How can you explain something when you start wrong?

I didn't invented anything, I just understood HOW THEY BUILT THE PYRAMIDS...I'm an architect...and that helps...a little...

The guys who deserve something are our Egyptian Ancestors...

What impressive work they did...

You should have a look at:

www.3ds.com/khufu

and

www.construire-la-grande-pyramide.fr

If you want more information, feel free to ask.

Best regards

Jean-Pierre Houdin

Friday, February 20, 2009

IS PERSONALIZATION REALLY THAT "PERSONAL"?



Do you ever wonder what the meaning is behind the words that we use every day? I admit that I’m a geek when it comes to etymology. My fetish is word origins and especially tracking down the roots of words that we just toss off, often without thinking much about them. I like to rustle through the OED and various etymological dictionaries, lexicons of slang, clichés, and the like at random, just to see what turns up.

The Internet has been widely acclaimed as possibly the greatest social transformer since Gutenberg’s invention of the printing press and moveable type. Among other things, the Web has made community, interactivity, and personalization standard features if not demands, and even requirements of contemporary life—at least for many of the billion people who are now online.

I’ve read a lot about social media in the last year—whether in The Economist or in such books as Wired writer Jeff Howe’s Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business, Clay Shirky’s classic Here Comes Everybody: The Power of Organizing Without Organizations, and John Clippinger’s A Crowd of One: The Future of Individual Identity. Now, the power of group think and action is not new, as Howe points out. Early hunter societies quickly learned that two brains—or at least two atlatls—were better than one (a basic meaning of “crowd sourcing”) in striking down Pleistocene prey. Even though all of these books are about leveraging the many, they have made me think about what “personalization” of the individual—in the context of the Internet and technology—really means.

MySpace, iPhone, YouTube. It’s all about the individual, one might think at first blush. Ostensibly, “personalization” means customizing features to suit an individual’s taste and style. But are we really being bamboozled a bit here? When you’re setting up your Facebook or LinkedIn profile, for example, aren’t you being crammed and compartmentalized into convenient categories of somewhat generalized interest? I mean, Netvibes and other RSS aggregators offer the convenience of creating a semblance of your very own newsstand. Maybe there’s a precedent even in print media, for The New York Times masthead still announces “All The News That’s Fit To Print”, which some cynical, if insightful, soul once suggested should really read, “All The News That Fits.” But at the end of the day, isn’t a lot of information being left out for the sake of making it all fit—whether in The New York Times or in our social network profiles?

When you look up “personalization” in The Barnhart Dictionary of Etymology, you are guided to its root in the word “personal”. The use of the word apparently goes back to before 1387, when it was borrowed from the Old French word “personel”, which came from the Latin “persona”, which we are told described “a person”. More interesting is that its use to describe individuality or a distinctive character was first recorded in 1795. Before the tide of European and American revolutions, which occurred just prior to that time, the only individuals of note were generally monarchs and the royal classes who worked for them. Otherwise, there were the great masses or “commons”.

Even in the field of astrology, natal readings for individuals—excepting monarchs and royals—were relatively unknown prior to the 18th century. Mundane Astrology, as it was called, was the province of figuring out the future for countries and rulers, but the Average Joe was of little consequence in the prognostications of court astrologers. The rise of the individual, then, may be echoed in the actual need for the word “personal” to describe something more than just “a person”.

So, in describing ourselves within the social network “city limits” of a profile page, something has to go. Clippinger’s book provides a perspective from “social physics”—one with a debt to anthropology and sociology—which says we are defined as individuals, in part, by our desire to be part of the crowd, and by what others reflect back to us and think of us.

It reminds me of something John Densmore, the drummer of The Doors, once said to me when I asked him, shortly after the release of Oliver Stone’s biopic, what he thought of the movie. Given the troubled production—during which the three “surviving” members of the band were all consultants and then decided to bail “due to creative differences” with the director—John was quite diplomatic. “Well,” he answered, “I guess when they make a movie of your life in two-and-a-half hours, they’ve got to leave something out…”

Maybe when you are trying to personalize a medium that is far more than a mass medium—arguably the first truly global medium—you don’t want to design a network that will unravel from accommodating too much uniqueness or the truly customized. Are we then losing anything of our originality in the process of being conscripted by the need for interactivity and community socialization that the Web indulges and has made de rigueur?

Jeremy Rifkin described to me how the Baby Boomers’ parents were the last generation who had a historical frame of reference—in other words, they defined themselves by looking back at World War II and the Great Depression. By contrast, Jeremy said that starting with the Boomers, the generations following were all defined by the Self and self-reference. The Boomers and those to follow are all “therapeutic generations”.

Western Psychiatry and Tibetan Buddhism would both say that the Battle of Ego is one we all face as human beings. In this Battle, we are thrown into an ongoing war that seeks, in essence, a balance of power between a healthy sense of self and the egoistic behavior at the root of the neurosis and psychosis that damages others and therefore, ultimately, ourselves. Who knows whether the Web is now providing us with a playground where we will lose the Battle, as our personal identities become branded by misleading marketing prefixes like “My” and “i”, or by fitting ourselves neatly onto a profile page.

Salvador Dali once remarked, “Perfection is to get lost!”, but I don’t think we should surrender without a good fight. Technology undoubtedly brings benefits and progress with it, yet when machines create efficiencies for us, what do we lose in the process? Is there another kind of “identity theft” at work here? There is no free lunch when we are not only the consumers, but also what is consumed.

Sunday, February 15, 2009

WHY THE WEB IS A TIME MACHINE


Have you ever noticed how your sense of time is affected when you are online? I remember when I first started exploring the Internet (in a bygone era when such activity was rather cutely described as “surfing the web”), and being interrupted by my wife at around 3 in the morning with the question, “Do you know what time it is?”

I’d been online since early that evening and, to tell the truth, I had absolutely no idea what time it was—or of the hidden subtext buried in her question. When I contemplate how the Web has changed since then, one thing stands out: the novelty of finding the new may have dissipated, but there is still a sense of being in a different time zone when online. Today, the so-called “three-second rule” that seems to govern so much web marketing and behavior dictates that a site’s “call to action” or “value proposition” be placed prominently in the upper right quadrant of the screen in order to capture the nano attention span of the current-day web user.

So, in the fifteen years or so since Mosaic, the first widely adopted graphical web browser, we seem to have narrowed our field of vision with an increased demand for instant gratification—the cannibalistic YouTube effect of watching new video after new video, in contrast to the appointment and series viewing habits that once dominated broadcast television. And now, with some 11.8 billion web pages (as of 2005) to choose from, we have given new meaning to the idea that the next channel is “just a click of the remote away”.

In this case, the next site is just a mouse click away—the result of a generational change as much as of the sheer abundance of choice. This seemingly infinite sprawl of sites called for an organizing principle, much like a contemporary Alexandrian Library—just with all of its index cards thrown chaotically into virtual space. Hence the search engine appeared on the scene, and now Google famously, or infamously, owns much of that universe.

Still, the one thing that infinity seems to belie is that we have a lot of time on our hands. I joke with some of my friends that they must have full-time staffs to manage their social media accounts. And Twitter is the most recent example of breaking time down into the nano. It appears that, just like some virtual Alice, we are becoming both smaller and longer in time through the Web. I still often forget how long I’ve been online—even if the time I spend “surfing” has been replaced by more targeted use. And when I think about where the time goes when I am online, I am often reminded of something that the Austrian spiritual scientist Rudolf Steiner said at the beginning of the 20th century.

In one of his more obscure papers, he predicted that by the end of the century, a new life-form would appear that was both non-biological and would grow in parallel with biological life-forms by using their energy to propagate itself. What if that’s where all the time is going? Sounds kind of creepy, but the reality is that the silicon-based life-forms have already arrived and may be thriving quite well on our backs.

In his 2007 book New Theories of Everything, the English cosmologist, theoretical physicist, and mathematician John D. Barrow writes: “Today, a science fiction writer looking for a futuristic tale of silicon dominance would not pick upon the chemistry of silicon so much as the physics of silicon for his prognostications. But this form of silicon life could not have evolved spontaneously: it requires a carbon-based life-form to act as a catalyst. We are the catalyst. A future world of computer circuits, getting smaller and smaller yet faster and faster, is a plausible future ‘life-form’, more technically competent than our own.”

Barrow’s last statement certainly gives pause. First, it gives new meaning to the notion that the Singularity is here (see Ray Kurzweil’s The Singularity Is Near). Additionally, any advanced extraterrestrial tourists cruising through our galactic neck of the woods and seeing just how little we have evolved since the Upper Paleolithic—with wars and climate change ruling the day, not millennia—would give this small planet an “F” on its cosmic report card. So perhaps it’s not a big stretch that machines could evolve into a “more technically competent” species than our own. And maybe we should be more conscious of where the time goes when we are in the virtual pipeline looking for that perfect, next breaker.

My son has a Time Machine book that came with a delightful pocket watch whose hands and numbers run backwards. I’m still trying to figure out how to wind it. But it has led me to ponder how interesting it would be if human life had its own version of a web “history” or back button that worked as well as it does for the new breed of silicon-based life-forms. If they should eventually ask us to join them, perhaps this would be a deciding factor in their favor. As Jorge Luis Borges once noted, “The future is inevitable and precise, but it may not occur. God lurks in the gaps.” Maybe the best that our partnership with technology can do is to point us in time toward those gaps as we surf between the waves of web pages and electricity. Or, as Tuli Kupferberg of The Fugs once said, “I now pronounce you Man and Machine.”