Saturday, February 28, 2009

SOCIAL MEDIA SMALL AND BIG


Social Media has done a lot for fame. While reality TV went a long way toward enabling the average Joe and Jane to realize Andy Warhol’s prediction that in the future everyone would be famous for fifteen minutes, I read recently that Social Media is ensuring that in the future, social network users will at least be famous to fifteen people.

Reality TV has also changed our notions of reality—especially since it is actually a highly scripted medium—but I won’t give up any trade secrets here, and the question of "what is reality after reality shows?" is better left for another post. While the medium of the Web has famously wired us all as global citizens—at least the billion of us who are now online worldwide—social media has also had the flip side of making our local "neighborhood" more relevant.

Social networks are now populated by over a quarter of a billion users, so the possibilities of growing one’s own network seem as big as a customized pyramid scheme. Facebook, however, places limits on how big or famous you can actually be. Currently, you can have up to 100 Friend Lists and up to 1,500 friends per Friend List. The multiple isn’t bad: if you maximized your Lists to the limit (100 × 1,500, or 150,000 people), it represents the reach of a new music release that has done extremely well by today’s standards. Remember when a gold record award celebrated sales of over 500,000 units of CDs, tapes, or vinyl, and a platinum award one million? Now, a 50,000 seller is cause for…well, some kind of celebration.
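To make the arithmetic explicit, here is a back-of-the-envelope sketch (purely illustrative, using only the limits and sales figures quoted above):

```python
# Back-of-the-envelope reach math, using only the figures quoted above.
MAX_FRIEND_LISTS = 100
MAX_FRIENDS_PER_LIST = 1500

max_reach = MAX_FRIEND_LISTS * MAX_FRIENDS_PER_LIST  # 150,000 people

GOLD = 500_000        # units once required for a gold record
PLATINUM = 1_000_000  # ...and for platinum
TODAY_HIT = 50_000    # a seller that is "cause for...some kind of celebration"

print(f"Maxed-out network reach: {max_reach:,}")                 # 150,000
print(f"Fraction of a gold record: {max_reach / GOLD}")          # 0.3
print(f"Fraction of a platinum record: {max_reach / PLATINUM}")  # 0.15
print(f"Today's 'hits' per network: {max_reach // TODAY_HIT}")   # 3
```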

So, the Social Media Effect has made our own local universe of possibilities expansive, but compared with the expanding universe of Internet pages, numbering over a trillion last summer, the world of social media has, in fact, shrunk-wrapped us all. One might find a distributed computing strategy handy to manage a social network that expanded beyond the known limits circumscribed by hosting and bandwidth on MySpace, Facebook, Bebo, and other networks. Even for the media famous, there seems to be an organic ecosystem dictating just how big social networks can grow: Guy Kawasaki, managing director of Garage Technology Ventures, Apple fellow, co-founder of Alltop, author, and guru, has 11,290 fans on his Facebook page. Still, if Guy decided to record and release an album to his "fans" and it sold through, he wouldn’t even make a dent on the Billboard charts.

What is happening has been best described by the very title of marketing genius Seth Godin’s worthy and fun book, Small Is the New Big: and 183 Other Riffs, Rants, and Remarkable Business Ideas. It’s not necessarily a new idea: economist E.F. Schumacher’s 1973 book, Small Is Beautiful, is a seminal text not only in its call for sustainable development, but also in its advocacy of the small. A core idea growing out of the author’s study of village-based economics is its assertion of “Buddhist economics”, summarized in Wikipedia: "[A modern economist] is used to measuring the 'standard of living' by the amount of annual consumption, assuming all the time that a man who consumes more is 'better off' than a man who consumes less. A Buddhist economist would consider this approach excessively irrational: since consumption is merely a means to human well-being, the aim should be to obtain the maximum of well-being with the minimum of consumption…The less toil there is, the more time and strength is left for artistic creativity. Modern economics, on the other hand, considers consumption to be the sole end and purpose of all economic activity."

The notion of the small having a far-reaching impact goes back in Western culture to the Butterfly Effect, first described in 1890, revisited in Ray Bradbury’s 1952 time-travel short story “A Sound of Thunder” (informing the phenomenon known as the time paradox), and made popular by Edward Lorenz as part of Chaos Theory and his study of computer models of weather prediction.

Lorenz published his work in 1963 and presented it at the New York Academy of Sciences, where "One meteorologist remarked that if the theory were correct, one flap of a seagull's wings could change the course of weather forever." The seagull was eventually replaced with the poetry of the butterfly, and at an American Association for the Advancement of Science meeting in 1972, Philip Merilees concocted the title of Lorenz’s talk: “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” When we update our social network status, we are flapping our virtual wings.
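For the technically curious, Lorenz's sensitivity to initial conditions is easy to see for yourself. Here is a minimal sketch (assuming NumPy and SciPy; the one-part-in-a-million perturbation and the run length are my own illustrative choices) that integrates his classic three-equation convection model from two nearly identical starting points:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz's 1963 convection model: three coupled ODEs whose
# trajectories diverge exponentially from nearby starting points.
def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_span = (0.0, 30.0)
t_eval = np.linspace(*t_span, 3000)

# Two starting states differing by one part in a million:
# the numerical equivalent of a single flap of the wings.
run_a = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0], t_eval=t_eval)
run_b = solve_ivp(lorenz, t_span, [1.000001, 1.0, 1.0], t_eval=t_eval)

# By the end of the run, the two "forecasts" bear no resemblance
# to each other, despite the nearly identical beginnings.
print("largest divergence:", np.abs(run_a.y - run_b.y).max())
```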

Social networks are an organic growth of the Web’s key value proposition of personalizing the world. The expansive potential of one’s personal social network is akin to the relationship of the microcosmic—in this case, the personality—to the personal macrocosmic—its potential reach to scores, hundreds, if not thousands of friends who then orbit our individual pages waiting for the next transmission of what we are doing, thinking, loving, sharing. But how deep can the quality and definition of “friend” be when friends number in the thousands?

Charles and Ray Eames provided a stunning, classic video and book version of the relationship between the micro and the macro in their Powers of Ten. But humans need the comfort of scale in the form of intimate relationships, family, sports teams, fan sites, personal networks, and other tribal affiliations. The stars may be “all connected to the brain” as Pete Townshend once wrote in “Pure and Easy”, but sometimes the sprawl of thousands of stars on a clear night can present a canvas that is daunting as hell and not heaven in its infinite possibilities beyond human comprehension. We humans also have a tendency to get lost in a crowd.

Or as Elias Canetti once observed in his Crowds and Power, a fire sometimes has more power to unify a theater than a play can. The individual has the power to understand "the play is the thing" as a primary experience. But the natural force of fire has an elemental power that everyone understands with his or her reptilian brain—it’s fight-or-flight time, baby, and there ain’t no time to think about it when the whole shithouse is going up in flames.

Social Media theory posits that the group mind, the crowd, actually has the power to think, and maybe even to think better than one lowly Mensa member. But Canetti also said that the ultimate crowd may be the tribal pack of spermatozoa, out of which only one has enough fame potential to survive the swim upstream to party down with the egg. So, let’s send out the smoke signal: Social Media needs its own Darwin to sort out the details of who will best survive.

Wednesday, February 25, 2009

NATIONAL NANO MEMORY


If you remember, the national dialogue in the US prior to 9/11 was secure in its comfort zone of celebrities and sex. In Los Angeles, the local conversation was about the LA Lakers, who were expected to have another great season, an assumption celebrated by a growing number of Laker flags ceremoniously fastened to car windows and waving purple and yellow all over the LA freeways.

Cut To: Post-9/11, the Laker banners came down and were replaced by American flags, more numerous and sometimes larger when they weren’t the commercially bought versions. To say that we lost a lot as a country in those days is beyond an understatement. But one of the things that we famously or infamously lost was the opportunity to embrace the groundswell of world opinion that was in our favor. The Bush Administration fumbled that ball when all but a handful of the world’s nations sympathized with our plight and national tragedy.

Instead, and as we all know all too well, our leaders and news media cultivated and fed a national spirit of jingoism and Wild West-style revenge leading to wars in foreign lands. Those wars continue to haunt us at a weekly cost of $1 billion and consume the declining American hegemony in an echo of Rome, which collapsed on the home front due in large part to the impossibility of sustaining its Empire militarily and logistically abroad.

As I joined the daily ranks of the freeway lemmings, it was hard to ignore the procession of red, white, and blue accompanied by a medley of decals announcing “United We Stand” and the like. Still, it was easy to understand how hoisting and affixing the symbolic provided some grounding—if not hope—in the grim days and months following the attack on the World Trade Center. But about six months after 9/11, I noticed that the American flags started to dwindle, then disappeared altogether, replaced by the familiar swarm of waving Laker flags. It struck me not only as premature, to say the least, but as somehow significant.

Did this transition back to cheering on a local sports franchise reflect something, however informal, about how long the national attention span actually was? Even though there was some self-reflective conversation, largely on the so-called Left, about why the attack occurred—round up the usual suspects: Gore Vidal, Eqbal Ahmad, Noam Chomsky—it seemed that we collectively lost the opportunity to ask ourselves the many hard questions, usually summarized by the Right and Left alike as “Why Do They Hate Us?”

My informal personal “survey” of the national short-term attention span may not be scientific, but it seems to me that we have long suffered from either a form of communal ADD or shoddy, selective memory. John Ralston Saul, named one of Utne Reader’s 100 “visionaries”, observes in his prescient 1995 book, The Unconscious Civilization: “…free speech and democracy are closely tied to an active, practical use of memory—that is, history—as well as an unbroken sense of the public good. Commerce has no memory. Its great strength is in its ability to constantly start again: a continual recreation of virginity. Commerce also has no particular attachment to any particular society. It is about making money, which is just fine, as far as it goes.”

I don’t want to add any gloom to the euphoria that has accompanied the new Administration’s arrival in Washington. Even recognizing that President Obama is courageously facing his predecessor’s inherited “legacy”, which is more like a firestorm, we are still realistically circling the drain of something that looks an awful lot like Depression 2.0. Several months ago, the “experts” finally proclaimed that it was “official”—we were in a recession—and had been so for a year. But this wasn’t really news for a lot of non-experts who had not been waiting for the confirming metrics, having already seen it all too closely in the form of shrink-wrapped personal financial circumstances.

If the six-month National Attention Span rule holds, perhaps all those red, white, and blue “Hope”, “Change”, “Progress”, and “Yes We Did” bumper stickers won’t be replaced, but the euphoria of potential change is due to leave us by springtime at the latest, or six months after the Election. I obviously hope not, and I also hope that maybe this picture is just a shallow LA thing and not reflective of the entire Nation after all. Maybe it will just be Dodger flags fluttering in the smog. But somehow, I feel like I should be getting ready to hoist either the Jolly Roger or Tibetan prayer flags from my car window...and hopefully, it’s the latter. What do you think?

Friday, February 20, 2009

IS PERSONALIZATION REALLY THAT "PERSONAL"?



Do you ever wonder about the meaning behind the words that we use every day? I admit that I’m a geek when it comes to etymology. My fetish is word origins, especially tracking down the roots of words that we just toss off, often without thinking much about them. I like to rustle through the OED and various etymological dictionaries, lexicons of slang, clichés, and the like at random, just to see what turns up.

The Internet has been widely acclaimed as possibly the greatest social transformer since Gutenberg’s invention of the printing press and moveable type. Among other things, the Web has made community, interactivity, and personalization standard features if not demands, and even requirements of contemporary life—at least for many of the billion people who are now online.

I’ve read a lot about social media in the last year—whether in The Economist or in such books as Wired writer Jeff Howe’s Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business, Clay Shirky’s classic Here Comes Everybody: The Power of Organizing Without Organizations, and John Clippinger’s A Crowd of One: The Future of Individual Identity. Now, the power of group thinking and action is not new, as Howe points out. Early hunter societies quickly learned that two brains—or at least two atlatls—were better than one (a basic meaning of “crowdsourcing”) in striking down Pleistocene prey. Even though all of these books are about leveraging the many, they have made me think about what “personalization” of the individual—in the context of the Internet and technology—really means.

MySpace, iPhone, YouTube. It’s all about the individual, one might think at first blush. Ostensibly, “personalization” means customizing features to suit an individual’s taste and style. But are we really being bamboozled a bit here? When you’re setting up your Facebook or LinkedIn profile, for example, aren’t you being crammed and compartmentalized into convenient categories of somewhat generalized interest? I mean, netvibes and other RSS aggregators offer the convenience of creating a semblance of your very own newsstand. Maybe there’s a precedent even in print media, for The New York Times masthead still announces, “All The News That’s Fit To Print”, which some cynical, if insightful, soul once suggested should really read, “All The News That Fits.” But at the end of the day, isn’t a lot of information being left out for the sake of making it all fit—whether in The New York Times or in our social network profiles?

When you look up “personalization” in The Barnhart Dictionary of Etymology, you are guided to its root in the word “personal”. The use of the word apparently goes back to before 1387, when it was borrowed from the Old French word “personel”, which came from the Latin word “persona”, which we are told meant “a person”. More interesting is that its use to describe individuality or a distinctive character was first recorded in 1795. Before the tide of European and American revolutions, which occurred just prior to that time, the only individuals of note were generally monarchs and the royal classes who served them. Otherwise, there were the great masses or “commons”.

Even in the field of astrology, natal readings for individuals--excepting monarchs and royals--were relatively unknown prior to the 18th century. Mundane Astrology, as it was called, was the province of figuring out the future for countries and rulers, but the Average Joe was of little consequence in the prognostications of court astrologers. The rise of the individual, then, may be echoed in the actual need for the word “personal” to describe something more than just “a person”.

So, in describing ourselves within the social network “city limits” of a profile page, something has to go. Clippinger’s book provides a perspective from "social physics" with a debt to anthropology and sociology that says we are defined as individuals, in part, by our desire to be part of the crowd—and by what is reflected back to us by others and what they think.

It reminds me of something John Densmore, the drummer of The Doors, said to me once when, shortly after the release of Oliver Stone’s biopic, I asked him what he thought of the movie. Given the troubled production, during which the three “surviving” members of the band were all consultants—and then decided to bail “due to creative differences” with the director—John was quite diplomatic. “Well,” he answered, “I guess when they make a movie of your life in two-and-a-half hours, they’ve got to leave something out…”

Maybe when you are trying to personalize a medium that is far more than a mass medium—arguably the first truly global medium—you don’t want to design a network that will unravel from accommodating too much uniqueness or the truly customized. Are we then losing anything of our originality in the process of being conscripted by the need for interactivity and community socialization that the Web indulges and has made de rigueur?

Jeremy Rifkin described to me how the Baby Boomers’ parents were the last generation who had a historical frame of reference—in other words, they defined themselves by looking back at World War II and the Great Depression. By contrast, Jeremy said that starting with the Boomers, the generations following were all defined by the Self and self-reference. The Boomers and those to follow are all “therapeutic generations”.

Western Psychiatry and Tibetan Buddhism would say that the Battle of Ego is one that we all face as human beings. In this Battle, we are thrown into an ongoing war that in essence seeks a balance of power between a healthy sense of self and the egoistic behavior at the root of neurosis and psychosis, which damages others and therefore, ultimately, ourselves. Who knows whether the Web is now providing us with a playground where we will lose the Battle as our personal identities become branded by misleading marketing prefixes like “My” and “i”, or by fitting ourselves neatly onto a profile page?

Salvador Dali once remarked, “Perfection is to get lost!”, but I don’t think we should give up the ghost without a good fight. Technology undoubtedly brings with it benefits and progress, but when machines create efficiencies for us, what do we lose in the process? Is there another kind of "identity theft" at work here? There is no free lunch when we are not only consumers, but also what is consumed.

Sunday, February 15, 2009

WHY THE WEB IS A TIME MACHINE


Have you ever noticed how your sense of time is affected when you are online? I remember when I first started exploring the Internet (in a bygone era when such activity was somewhat cutely described as “surfing the web”) and being interrupted by my wife at around 3 in the morning when she asked, “Do you know what time it is?”

I’d been online since early that evening and, to tell the truth, I had absolutely no idea what time it was—or of the hidden subtext buried in her question. When I contemplate how the Web has changed since then, one of the things that stands out is that the novelty of finding the new may have dissipated, but there is still a sense of being in a different time zone when online. Today, the so-called “three-second rule”, which seems to govern a lot of web marketing and behavior, dictates that a site’s “call to action” or “value proposition” must be prominently placed in the upper right quadrant of the screen in order to capture the nano attention span of the current-day web user.

So, in the 15 years or so since Mosaic, the first widely adopted graphical web browser, we seem to have narrowed our field of vision with an increased demand for instant gratification—the cannibalistic YouTube effect of watching new video after new video, in contrast to the appointment and series viewing habits that once dominated broadcast television. And now, with 11.8 billion web sites and blogs (as of 2005) to choose from, new meaning has been given to “the next channel is just a click of the remote away”.

In this case, the next site is just a mouse click away, a habit that seems to be the result of a generational change as much as of so much choice. This seemingly infinite sprawl of sites called for an organizing principle, much like a contemporary Alexandrian Library—just with all of its index cards thrown chaotically into virtual space. Hence, the search engine appeared on the scene, and now Google famously or infamously owns much of that universe.
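To make that organizing principle concrete, here is a toy inverted index, the core data structure beneath every search engine: the card catalog for the chaotic library. (A minimal sketch; the pages and the query are invented for illustration.)

```python
from collections import defaultdict

# A handful of "web pages": the chaotic library to be organized.
pages = {
    "pageA": "small is the new big",
    "pageB": "the web is a time machine",
    "pageC": "big ideas on the web",
}

# The inverted index maps each word to the set of pages containing it,
# so a lookup never has to rescan the library itself.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query: str) -> list[str]:
    """Return the pages that contain every word of the query."""
    words = query.split()
    if not words:
        return []
    return sorted(set.intersection(*(index[w] for w in words)))

print(search("the web"))  # ['pageB', 'pageC']
```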

Still, the one thing that infinity seems to belie is that we have a lot of time on our hands. I joke with some of my friends that they must have full-time staffs to manage their social media accounts. And Twitter is the most recent example of breaking time down into the nano. It appears that, just like some virtual Alice, we are getting both smaller and longer in time through the Web. I still often forget how long I’ve been online—even if the time I spend “surfing” has been replaced by more targeted use. And when I think about where the time is going when I am online, I am often reminded of something that the Austrian spiritual scientist Rudolf Steiner said at the beginning of the 20th century.

In one of his more obscure papers, Steiner predicted that by the end of the century a new life-form would appear: non-biological, and growing in parallel with biological life-forms by using their energy to propagate itself. What if that’s where all the time is going? It sounds kind of creepy, but the reality is that silicon-based life-forms have already arrived and may be thriving quite well on our backs.

In his 2007 book, New Theories of Everything, the English cosmologist, theoretical physicist, and mathematician John D. Barrow writes: “Today, a science fiction writer looking for a futuristic tale of silicon dominance would not pick upon the chemistry of silicon so much as the physics of silicon for his prognostications. But this form of silicon life could not have evolved spontaneously: it requires a carbon-based life-form to act as a catalyst. We are the catalyst. A future world of computer circuits, getting smaller and smaller yet faster and faster, is a plausible future “life-form”, more technically competent than our own.”

Barrow’s last statement certainly gives pause. First, it gives new meaning to the notion that the Singularity is here (see Ray Kurzweil’s The Singularity Is Near). Additionally, any advanced extraterrestrial tourists cruising in our galactic neck of the woods and seeing just how little we have evolved since the Upper Paleolithic—with wars and climate change ruling the day, not millennia—would give this small planet an “F” on its cosmic report card. So, perhaps it’s not a big stretch that machines can evolve into a “more technically competent” species than us. And maybe we should be more conscious of where the time goes when we are in the virtual pipeline looking for that perfect, next breaker.

My son has a Time Machine book that came with a delightful pocket watch whose hands and numbers run backwards. I’m still trying to figure out how to wind it. But it’s led me to ponder how interesting it would be if human life had its own version of a web “history” or back button that worked as well as it does for the new breed of silicon-based life-forms. If they should eventually ask us to join them, perhaps this would be a deciding factor in their favor. As Jorge Luis Borges once noted, “The future is inevitable and precise, but it may not occur. God lurks in the gaps.” Maybe the best that our partnership with technology can do is to point us in time toward the gaps as we surf between the waves of web pages and electricity. Or as Tuli Kupferberg of The Fugs once said, “I now pronounce you Man and Machine.”