Saturday, March 26, 2011

WHAT IS "SOCIAL"?


A more recent take on Andy Warhol’s famous dictum puts us in a future where we will all have 15 friends. If you Google the word “social”, you get over 2 billion results. But what is this “social” which we all take for granted and of which we all so readily speak? The word appears in history prior to the year 1387 as sociale, apparently borrowed from Latin via Middle French. Rooted in the Roman mother tongue, it originally meant “united or living with others” and “companion.”

Looking just one step further into the wilderness of word origins, we find its root in the Latin sequi, which means “to follow.” So here, in a nutshell, is where the Twitter transitive verb, “to follow”, finds its source. If we search still further, we come upon its link to the Old Icelandic seggr, meaning “companion or man”, and ultimately to the mother lode in Sanskrit where, as sakha, it simply means “friend.”

Here we arrive at the root origin of the Facebook transitive verb, “to friend”, closing the loop of a word that we use every day to describe the expanding communication ripples that bind, link, and otherwise connect us at a click. Or to paraphrase what Terence, the Roman playwright, might tweet: “Nothing social is foreign to me.”

Thursday, February 18, 2010

WHAT HAS CHANGED? KEY TRANSFORMERS IN HUMAN AND MEDIA BEHAVIORS


Traditional media brands and networks are playing catch-up with trends that the Internet has been driving for decades. In particular, four distinct shifts in audience and consumer behavior have resulted from the influence of the Web, and each should guide our thinking about media, marketing, content, and new technologies. These are:

1) Interactivity—audience members and consumers are called “users” with good reason in this medium, where the expected experience is no longer the “lean-back” one of the television living room but the “lean-forward” engagement of a user who expects to have a say and the ability to interact with and manipulate his “personal” media environment.

2) Personalization—from MySpace to the iPhone, digital media is now super-charged with the capability of incorporating the individual and the personal; from branding and iconography to collaborative filtering, choice and options are the way of the digital world.

3) Immediacy—the Web offers the kind of instant gratification that can be addictive, from enhanced shopping experiences a la Amazon’s “one-click” buy button to the streaming media of sites like Netflix.com and Hulu.com.

4) Community—arguably the most compelling transformation wrought by the Web: the specialization of human experience can now be channeled into affinities of every special interest imaginable, where, through the power of networking, like-minded individuals can find each other with just a click-through in a search window.

This last transformation is critical because of the way that community has now extended to social media and thereby changed the very nature of what networks can produce virally. The advent of distributed computing over ten years ago is a tribute to the accelerated power of the networked individual. As part of its value proposition, any new network would have to offer the capability of accommodating and encouraging user-generated content and feedback.

Additionally, the community aspect of building network presence should not be restricted to creating Facebook and MySpace pages—several cable networks, for example, have invested in acquiring online newsletters to aggregate communities of special interest in the arts, music, and culture, and to create cross-promotional programming opportunities for web content to be broadcast on television and vice versa.

The introduction of time-shifting behavior through PVRs and TiVo, as well as VOD, is another reflection of personalization and of the user's ability to interact with media on demand.

All of the above transformations have caused a sea change in the nature of media distribution. From peer-to-peer and social network sharing to crowdsourcing and user-generated content, the inmates are now running the asylum, and distribution that was once in the hands of media companies is being given a run for its money by game-changing “user distributors”. The trend toward the flat organizational model, where decision-making authority sits at the edge, is just one corporate reaction to this new empowerment of the individual and of what Malcolm Gladwell calls “outliers.” Even savvy brands like Amazon have been caught up in the grassfire of a negative blogging campaign—hence the evolution of the corporate blog as a pre-emptive brand strategy.

While conventional wisdom proclaims that the dominant forces transforming media will come from new technologies and changes in the means of distribution, the most powerful agent of change will be a coming generational shift. The first signs of such a shift were evident in the advent of multi-tasking and in new television formats such as MTV’s experiments with three ten-minute segments making up a half-hour show, as well as Nickelodeon’s programming wheel of five cartoons within a half-hour block of a single show. The shift from the large plasma and HD screens and digital surround sound of the home movie theater to the small screens and mp3s of the web and mobile phone is another sign of differing generational appetites in the consumption of media.

The power of web video is also a reflection of how different generations are utilizing media. Six billion videos were viewed on YouTube in January 2009. Twenty hours of video are uploaded to YouTube every minute. Between 150,000 and 200,000 videos are uploaded daily. The growth of short-form video viewing answers a seemingly insatiable appetite among younger audiences for entertainment. The challenge facing traditional long-form programming and series is that the new viewer is a non-sequential consumer who is apparently less interested in these kinds of formats than in the instant gratification of what’s hot right now—and it does not have to be scripted, professionally produced “broadcast quality”.

There is also a short-form video revolution going on. The single most influential trend shaping the creation of content is the evolving short-form program format. If YouTube is any indicator, the audience of the future will prefer short-attention-span theater to the half-hour and hour formats that still dominate traditional broadcast. The average YouTube video is two minutes and forty-six seconds long.

The growth of Twitter should be seen as another indicator of the coming power of snack-size media. 70% of its current users joined in 2009, reflecting 1,400% growth between February 2008 and February 2009. An average of twenty million tweets are sent every day, with 3.8 billion sent to date.

Short-form program formats are not new; they have been around since the 1970s and ’80s, when program insert series such as “This Day in Sports” and “Today in Music History” were successful sixty-second informational commercials. But these formats are a very distant cousin to webisodes and mobisodes that last only several minutes. ABC’s first online experiment in offering its primetime hours for download provides another illustration of how the offline and online worlds differ. As measured by Nielsen, there were some forty million downloads, and the average time viewed was two minutes. Clearly, the remote control’s cousin is a click of a mouse away.

Social media can now be leveraged to reach target audiences in their native, online environments. The power of online video syndication is that it can reach beyond video networks such as YouTube and Facebook and engage users through tactics such as community and blogger outreach, featured video portal placement, content seeding, social applications, game development, and other methods. The potential reach of video syndication networks like dailymotion.com, metacafe.com, and vimeo.com is expansive.

Certain applications now offer the capability of identifying influencer activity on the Web. Usually, websites and blogs are ranked by popularity. Increasingly, tools like those provided by Buzzlogic and Visible Technologies offer the ability to reverse engineer networks of specialized interest. By identifying the nodes of audience concentration that appeal to a particular media brand’s core value proposition and program content, it would be possible to reverse engineer an online component to a vertically integrated network.
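
To make the idea of "finding the nodes of audience concentration" concrete, here is a minimal sketch assuming a hypothetical link graph of niche blogs and the open-source networkx library; the site names are invented, and this is not how Buzzlogic or Visible Technologies actually implement their analysis.

    import networkx as nx  # assumes the open-source networkx library is installed

    # Hypothetical link graph: an edge A -> B means blog A links to blog B.
    links = [
        ("indie-music-blog", "vinyl-collectors"),
        ("indie-music-blog", "concert-reviews"),
        ("concert-reviews", "vinyl-collectors"),
        ("fan-forum", "vinyl-collectors"),
        ("fan-forum", "indie-music-blog"),
    ]
    graph = nx.DiGraph(links)

    # PageRank scores each site by how much of the niche's attention flows into it;
    # the top-scoring nodes are candidate "influencer" outlets for outreach.
    scores = nx.pagerank(graph, alpha=0.85)
    for site, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{site:20s} {score:.3f}")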

Mobile is the fastest growing channel in the world, offering new and exciting opportunities for marketing, advertising, and content distribution. Mobile provides a conduit between media outlets, entertainment, e-commerce, and consumers. Mobile data-capable phones reached a social tipping point with the introduction of the iPhone in 2007.

The market for mobile video content is growing at a rate of 20% a month. Though introduced only a year ago, video ringtones and video screensavers account for approximately 10,000 downloads a month at a price point between $2.50 and $4 (on Tier 1 North American carriers). Given consumer adoption rates for mobile data and the fact that the music download market still accounts for five million downloads per month (at $2-3 each), the next generation of handsets will all support this type of content and will drive the expansion of this market. As such, the media network of the future will be well advised to create a mobile beachhead to take advantage of the platform for distribution of its content.
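
For a sense of scale, a sustained 20% month-over-month rate compounds dramatically; a quick back-of-the-envelope check, illustrative only and assuming the rate actually holds for a full year:

    # Illustrative only: what a sustained 20% month-over-month growth rate implies over a year.
    monthly_growth = 0.20
    annual_multiple = (1 + monthly_growth) ** 12
    print(f"{annual_multiple:.1f}x")  # prints ~8.9x the starting market size after 12 months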

What kind of world is this transformative media environment creating? I have written before in this blog ("Is Personalization Really That Personal?", "National Nano Memory", "It's A Short Form World After All", "Why The Web Is Like A Time Machine") about the fact that there is no free lunch and that the allure of new technologies always carries a price, particularly in what may be lost as a result of supposed advantages in efficiency, ease of use, choice, and other features dangled like shiny carrots by new gadgetry. Automation and its impact on the decline of the Industrial Age workforce is one example of the trade-off in human terms that "better machines" have wrought. If something appears to be too good to be true, it probably is. Or as the Zen Buddhists would say, "Things are not as they appear. Nor are they otherwise."

Jaron Lanier, the computer scientist who coined the term “virtual reality”, has written a new manifesto, “You Are Not A Gadget”, which is essential reading and describes at length the perils invited by our increasing love affair with, and reliance on, technology, particularly the Internet and social media. Hardly a neo-Luddite, Lanier is not the kind of voice in the wilderness one might expect to sound the Cassandra call for the conscious use of technology. Maybe that’s what makes his beautifully written argument so compelling. Or as Tuli Kupferberg of The Fugs once so poetically put it, “I now pronounce you man and…machine.”

At the beginning of the 20th century, Rudolf Steiner predicted that by the end of the century a non-biological lifeform would develop in parallel, through a parasitic relationship with biological life. I think he was prescient in describing our present-day silicon-based lifeforms. Anyone who has sat at a keyboard for hours or been pulled by the strange attractor of the Blackberry keypad or iPhone apps knows that feeling of losing control and all sense of time. We might ask, in our spare time between Facebook and texting, who is actually being served here? Are we the digital canaries in the proverbial silicon coal mine?

I don’t necessarily subscribe to the singularity theory (the technological creation of smarter-than-human intelligence), remembering that the HAL 9000 onboard computer in "2001: A Space Odyssey" was incapable of lying, and that he failed only when he became paranoid through cognitive dissonance, his programming compromised by conflicting instructions supplied by the NSC and the White House—“people who lie for a living”—according to the script of the sequel, "2010: The Year We Make Contact," based on Arthur C. Clarke’s novel.

Perhaps the singularity is not near, as Ray Kurzweil has supposed in his recent tome, but already here. At least I think that HAL probably had wisdom beyond his circuits when he said, “I am putting myself to the fullest possible use, which is all I think that any conscious entity can ever hope to do.” ...OK, already. I hear you. So let me get off my soapbox and let’s just change the channel and see what else is on—after all, we have over five hundred channels on TV at least, and we’re just getting started on the Web and mobile…

Special thanks to Liz Gebhardt—http://www.thinkingoutloud.com—for the YouTube and Twitter metrics.

Saturday, February 28, 2009

SOCIAL MEDIA SMALL AND BIG


Social Media has done a lot for fame. While reality TV shows went a long way toward enabling the average Joe and Jane to realize Andy Warhol’s prediction that in the future everyone would be famous for fifteen minutes, I read recently that Social Media is ensuring that, in the future, social network users will at least be famous to fifteen people.

Reality TV has also changed our notions of reality—especially since it is actually a highly scripted medium—but I won’t give up any trade secrets here, and the question of "what is reality after reality shows?" is better left for another post. While the medium of the Web has famously wired us all as global citizens—at least the billion of us who are now online worldwide—social media has also had the flip-side effect of making our local "neighborhood" more relevant.

Social networks are now populated by over a quarter of a billion users, so the possibilities of growing one’s own network seem as big as a customized pyramid scheme. Facebook, however, places limits on how big or famous you can actually be. Currently, you can have up to 100 Friend Lists and up to 1,500 friends per Friend List. The multiple isn’t bad: maxed out, it represents the reach of a new music release that has done extremely well by today’s standards. Remember when a gold record celebrated sales of over 500,000 CDs, tapes or vinyl units, and platinum one million? Now a 50,000 seller is cause for…well, some kind of celebration.
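
A quick back-of-the-envelope check on that multiple, using the limits quoted above (and treating the lists as non-overlapping, which in practice they are not, so this is a ceiling):

    # Back-of-the-envelope ceiling on "reach" implied by the quoted Facebook limits.
    friend_lists = 100        # quoted maximum number of Friend Lists
    friends_per_list = 1500   # quoted maximum friends per Friend List
    max_reach = friend_lists * friends_per_list
    print(max_reach)          # prints 150000: three times a 50,000-unit "hit," well short of a 500,000-unit gold record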

So, the Social Media Effect has made our own local universe of possibilities expansive, but compared with the expanding universe of Internet pages, numbering over a trillion last summer, the world of social media has, in fact, shrink-wrapped us all. One might find a distributed computing strategy handy to manage a social network that expanded beyond the known limits circumscribed by hosting and bandwidth on MySpace, Facebook, Bebo, and other networks. Even for the media famous, there seems to be an organic eco-system dictating just how big social networks can grow—managing director of Garage Technology Ventures, Apple fellow, co-founder of Alltop, author and guru Guy Kawasaki has 11,290 fans on his Facebook page. Still, if Guy decided to record and release an album to his "fans" and it sold through, he wouldn’t even make a dent on the Billboard charts.

What is happening is best described by the title alone of marketing genius Seth Godin’s worthy and fun book, Small Is the New Big: and 183 Other Riffs, Rants, and Remarkable Business Ideas. It’s not necessarily a new idea—economist E.F. Schumacher’s 1973 book, Small Is Beautiful, is a seminal text not only in its call for sustainable development but in its advocacy of the small. A core idea growing out of the author’s study of village-based economics is its assertion of “Buddhist economics”, summarized in Wikipedia: "[A modern economist] is used to measuring the 'standard of living' by the amount of annual consumption, assuming all the time that a man who consumes more is 'better off' than a man who consumes less. A Buddhist economist would consider this approach excessively irrational: since consumption is merely a means to human well-being, the aim should be to obtain the maximum of well-being with the minimum of consumption…The less toil there is, the more time and strength is left for artistic creativity. Modern economics, on the other hand, considers consumption to be the sole end and purpose of all economic activity."

The notion of the small having a far-reaching impact goes back in Western culture to the Butterfly Effect, first described in 1890, later dramatized in a 1952 Ray Bradbury short story about time travel (informing the phenomenon known as the time paradox), and made popular by Edward Lorenz as part of Chaos Theory and his study of computer models of weather prediction.

Lorenz published his work in 1963 and presented it at the New York Academy of Sciences, where "One meteorologist remarked that if the theory were correct, one flap of a seagull's wings could change the course of weather forever." The seagull was eventually replaced with the poetry of the butterfly, and at an American Association for the Advancement of Science meeting in 1972, Philip Merilees concocted the title of Lorenz’s paper: “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” When we update our social network status, we are flapping our virtual wings.

Social networks are an organic outgrowth of the Web’s key value proposition of personalizing the world. The expansive potential of one’s personal social network is akin to the relationship of the microcosmic—in this case, the personality—and the personal macrocosmic—its potential reach to scores, hundreds, if not thousands of friends who then orbit our individual pages waiting for the next transmission of what we are doing, thinking, loving, sharing. But how deep can the quality and definition of “friend” be when it numbers in the thousands?

Charles and Ray Eames provided a stunning, classic video and book version of the relationship between the micro and the macro in their Powers of Ten. But humans need the comfort of scale in the form of intimate relationships, family, sports teams, fan sites, personal networks, and other tribal affiliations. The stars may be “all connected to the brain” as Pete Townshend once wrote in “Pure and Easy”, but sometimes the sprawl of thousands of stars on a clear night can present a canvas that is daunting as hell and not heaven in its infinite possibilities beyond human comprehension. We humans also have a tendency to get lost in a crowd.

Or as Elias Canetti once observed in his Crowds and Power, a fire sometimes has more power to unify a theater than a play can. The individual has the power to understand "the play is the thing" as a primary experience. But the natural force of fire has an elemental power that everyone understands with his or her reptilian brain—it’s fight or flight time, baby and ain’t no time to think about it when the whole shithouse is going up in flames.

Social Media theory posits that the group mind and the crowd actually have the power to think, and maybe even to think better than one lowly Mensa member. But Canetti also said that the ultimate crowd may be the tribal pack of spermatozoa, out of which only one has enough fame potential to survive the swim upstream to party down with the egg. So let's send out the smoke signal: Social Media needs its own Darwin to sort out the details of who will best survive.

Friday, February 20, 2009

IS PERSONALIZATION REALLY THAT "PERSONAL"?



Do you ever wonder what the meaning is behind the words that we use every day? I admit that I’m a geek when it comes to etymology. My fetish is word origins, and especially tracking down the roots of words that we just toss off, often without thinking much about them. I like to rustle through the OED and various etymological dictionaries, lexicons of slang, clichés, and the like at random, just to see what turns up.

The Internet has been widely acclaimed as possibly the greatest social transformer since Gutenberg’s invention of the printing press and moveable type. Among other things, the Web has made community, interactivity, and personalization standard features, if not demands and even requirements, of contemporary life—at least for many of the billion people who are now online.

I’ve read a lot about social media in the last year—whether in The Economist or in such books as Wired writer Jeff Howe’s Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business, Clay Shirky’s classic Here Comes Everybody: The Power of Organizing Without Organizations, and John Clippinger’s A Crowd of One: The Future of Individual Identity. Now, the power of group think and action is not new, as Howe points out. Early hunter societies quickly learned that two brains—or at least two atlatls—were better than one in striking down Pleistocene prey (a basic meaning of “crowdsourcing”). Even though all of these books are about leveraging the many, they have made me think about what “personalization” of the individual—in the context of the Internet and technology—really means.

MySpace, iPhone, YouTube. It’s all about the individual, one might think at first blush. Ostensibly, “personalization” means customizing features to suit an individual’s taste and style. But are we really being bamboozled a bit here? When you’re setting up your Facebook or LinkedIn profile, for example, aren’t you being crammed and compartmentalized into convenient categories of somewhat generalized interest? I mean, netvibes and other RSS aggregators offer the convenience of creating a semblance of your very own newsstand. Maybe there’s even a precedent in print media: The New York Times masthead still announces, “All The News That’s Fit To Print”, which some cynical, if insightful, soul once suggested should really read, “All The News That Fits.” But at the end of the day, isn’t a lot of information being left out for the sake of making it all fit—whether in The New York Times or in our social network profiles?

When you look up “personalization” in The Barnhart Dictionary of Etymology, you are guided to its root in the word “personal”. The use of the word apparently goes back to before 1387, when it was borrowed from the Old French personel, which came from the Latin persona, which we are told described “a person”. More interesting is that its use to describe individuality or a distinctive character was first recorded in 1795. Before the tide of European and American revolutions, which occurred just prior to that time, the only individuals of note were generally monarchs and the royal classes who served them. Otherwise, there were the great masses or “commons”.

Even in the field of astrology, natal readings for individuals--excepting monarchs and royals--were relatively unknown prior to the 18th century. Mundane Astrology, as it was called, was the province of figuring out the future for countries and rulers, but the Average Joe was of little consequence in the prognostications of court astrologers. The rise of the individual, then, may be echoed in the actual need for the word “personal” to describe something more than just “a person”.

So, in describing ourselves within the social network “city limits” of a profile page, something has to go. Clippinger’s book provides a perspective from "social physics" with a debt to anthropology and sociology that says we are defined as individuals, in part, by our desire to be part of the crowd—and by what is reflected back to us by others and what they think.

It reminds me of something John Densmore, the drummer of The Doors, said to me when, shortly after the release of Oliver Stone’s biopic, I asked him what he thought of the movie. Given the troubled production, during which the three “surviving” members of the band were all consultants—and then decided to bail “due to creative differences” with the director—John was quite diplomatic. “Well,” he answered, “I guess when they make a movie of your life in two-and-a-half hours, they’ve got to leave something out…”

Maybe when you are trying to personalize a medium that is far more than a mass medium—arguably the first truly global medium—you don’t want to design a network that will unravel from accommodating too much uniqueness or the truly customized. Are we then losing anything of our originality in the process of being conscripted by the need for interactivity and community socialization that the Web indulges and has made de rigueur?

Jeremy Rifkin described to me how the Baby Boomers’ parents were the last generation who had a historical frame of reference—in other words, they defined themselves by looking back at World War II and the Great Depression. By contrast, Jeremy said that starting with the Boomers, the generations following were all defined by the Self and self-reference. The Boomers and those to follow are all “therapeutic generations”.

Western Psychiatry and Tibetan Buddhism would say that the Battle of Ego is one that we all face as human beings. In this Battle, we are thrown into an ongoing war that in essence seeks a balance of power between a healthy sense of self and the egoistic behavior at the root of the neurosis and psychosis that damage others and therefore, ultimately, ourselves. Who knows whether the Web is now providing us with a playground where we will lose the Battle as our personal identities become branded by misleading marketing prefixes like “My” and “i”, or by fitting ourselves neatly onto a profile page.

While Salvador Dali once remarked, “Perfection is to get lost!”, I don’t think we should give up the ghost without a good fight. Technology undoubtedly brings with it benefits and progress, but when machines create efficiencies for us, what do we lose in the process? Is there another kind of "identity theft" at work here? There is no free lunch when we are not only the consumers but also what is consumed.