Thursday, November 26, 2009

FINDING THE CENTER


Where are you? It seems like a simple enough question at first. I’m here writing at home. I’m at work. I’m in a meeting. In LA. In the USA. In my hometown at First and Main. And if in doubt, there’s always GPS. But, are any or all of these answers really accurate ways of describing place in space/time? We may have come a long way locating ourselves in the last several centuries. Amazingly enough, according to David Grann in “The Lost City of Z,” it was estimated that as late as 1740 there were as few as 120 countries actually mapped in the entire world. That left enormous regions of the planet’s landmass labeled with the captivating but elusive description “unexplored.”

Not only were there, famously, areas of the sea illuminated on early maps with dragons, sea serpents, and other monsters, but most people—whether navigators or just ordinary landlubbers—believed that if you ventured too far into the unknown, you would fall off of the edge of the world. The invention of the marine chronometer in the eighteenth century did a lot to help maps along by providing the key measurement of time needed to fix longitude.
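To spell out why a clock matters to a map, here is a toy sketch in Python (the function name and example figures are my own, purely illustrative): the Earth turns 360 degrees in 24 hours, or 15 degrees per hour, so the gap between local noon and the chronometer’s home-port time converts directly into degrees east or west of that port.

def longitude_from_time(local_noon_hours, home_port_hours):
    """Degrees of longitude west (+) or east (-) of the home port."""
    # 360 degrees of rotation / 24 hours = 15 degrees per hour
    return (home_port_hours - local_noon_hours) * 15.0

# Example: the chronometer reads 15:20 home-port time at local noon,
# so the ship sits (3 + 20/60) * 15 = 50 degrees west of that port.
print(longitude_from_time(12.0, 15 + 20 / 60))  # 50.0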

Prior to that, the so-called Age of Discovery and Exploration may not have seemed so grand to the “discovered” tribal peoples, who were doing quite well, thank you, without the intrusion of the conquering, gold- and spice-seeking boat peoples. It did much, however, to cure the notion that you’d fall off of a flat earth into the abyss. But, it wasn’t until the expansion of the British Empire in the 19th century that the lands and wild countries of the planet were finally mapped.

Google Earth has recently “mapped” areas that persisted as “unknowns”—like an area the size of Texas between the Amazonian tributaries the Xingu and Tapajos Rivers, described in the best-selling “Morning of the Magicians” as still unmapped going into the 1950s—or the infamous location of Area 51, the top-secret test facility whose existence has been denied by the US Air Force, now “mapped” using handy Soviet spy photographs. Still, our recent sense of security in thinking that we know where we are with such tools as Google Earth and GPS may turn out to be false.

A recent NY Times article about a new scientific study indicated that the increased use of GPS is having a deleterious impact on the spatial abilities of humans. It may be one more example of how the media and technologies that McLuhan celebrated as “extensions of man”—extensions of the senses—are having quite the opposite effect. Could it be that sensory systems developed thousands of years ago are being deadened by the use of technology?

The difference between men’s and women’s sense of direction, for example, is the subject of many jokes in popular culture. Evolutionary biologists offer theories that there may be a basis in truth to the differing navigational abilities of the sexes. They have theorized that women—who were stuck back in the cave raising clans and cooking—did not need to develop the same spatial acuity as their hirsute, club-carrying mates, who needed to remember geographical features for hunting purposes.

Our world--where we may be losing our compass, so to speak--is sharply contrasted by the tribal world. Take for example, the American Plains Indian ceremony known as the “smoke sacrifice.” First of all, the so-called “peace pipe” was the real deal. The tobacco that filled those pipes was the uncured variety, a highly potent hallucinogen. This is precisely why it spawned so many peace treaties that were not remembered by white men in the hungover hazy light of the following day. It is also the reason why tobacco was used ceremonially by native smokers in contrast to its convenient, addictive commercial form today.

The Smoke Sacrifice is illustrative of an orientation in the world and cosmos that tribal peoples have shared throughout time. The cardinal directions were not just points on the compass, but sacred points of origin associated with spirits, colors, animals, and other significators that defined the human world as allies. Anthropologist Dennis Tedlock has aptly called this orientation, specifically with respect to the Maya, “finding the center.”

The first puff of smoke that a modern cigarette, cigar, or pipe smoker takes is usually pulled greedily into the mouth and lungs. The Smoke Sacrifice is initiated when the smoker blows six puffs of smoke as an offering toward the cardinal points of the compass and to the axis described by the nadir and the midheaven, which runs up the spinal column. When this has been accomplished, the final puff is inhaled, thereby completing the ceremony by establishing the smoker as the center of the universe circumscribed by the sphere of smoke. In effect, one’s heart becomes that nodal, essential point of being. The ceremonial smoker is centered by paying respect to the sacred directions and is ready to move out into the greater world of creation that always surrounds us wherever we are. Now that is GPS!

So, the next time someone asks, “Where are you?”, think on how many responses a truly accurate answer might have—because even without the smoke, the heart is our only guide in finding the center which is everywhere that we find consciousness.

Saturday, September 5, 2009

THE DAY THE WORLD ENDED


Last month, while American consumers were being prepped and primed by media outlets and marketers for the 40th Anniversary of the Woodstock Festival, another anniversary was taking place at the beginning of August. I don’t want to be some kind of killjoy and take away from the nostalgic image blitz of stoned-out youth frolicking in the upstate New York mud, bathed in the electric rain of rock gods and demigods, many of whom I still worship. But, as Frank Zappa once said to me, “The world will end in nostalgia,” so it seems logical that I can’t get thoughts of August 6, 1945 out of my mind. Maybe it’s because the hills above my house have been on fire for the last several weeks, raining down ash and producing atmospheric conditions around us that resemble the smoky, yellow, eclipse-lit haze of some other planet. Driving back home from San Diego, I could see the giant mushroom cloud pluming over Pasadena from over a hundred miles away.

At 8:15 AM on that August morning 64 years ago, a B-29 bomber dropped a single bomb with the charming nickname “Little Boy” over the Japanese city of Hiroshima. On August 9, a second bomb called “Fat Man” was dropped on the city of Nagasaki. These particular targets were chosen, according to Stephen M. Younger in his book, "The Bomb: A New History", because they “had not suffered from the devastating bombing raids that had reduced Tokyo and other cities to little more than smoldering ruins. The hills that surrounded Hiroshima and Nagasaki would focus the effect of the blast, further increasing the destruction caused by the bombs.”

“Little Boy” fell for forty-three seconds to nineteen hundred feet above Hiroshima and exploded. The height had been chosen “to maximize the damage produced by the expanding nuclear fireball.” Its detonation created an intense flash that was called “brighter than a thousand suns.” Within seconds, an immense shock wave and firestorm swept the city, destroying everything in its wake, including some 68,000 buildings. Three days later, as Younger reports, “the United States demonstrated to Japan and the world that Hiroshima was not a one-off event” when it completely destroyed Nagasaki with a second atom bomb.

I heard a recent, breathless radio promo for a show called “Surviving Disaster” on Spike TV that described it this way—you have 20 seconds to cover your eyes and about 20 minutes to take cover from radioactive fallout. The promo ponderously warned, “It’s not a question of whether it will happen, but when.”

Just how the show draws its conclusion of inevitability is unclear, but the leap from the catastrophic events of Hiroshima and Nagasaki to nuclear attack as entertainment is a mind-boggling impulse, though not an exclusively American one. Godzilla is not only an iconic film monster, but is held in Japan with an almost religious reverence. One reason why is that Godzilla, Rodan, and other Japanese monster movies have been seen as a symbolic, subliminal response inspired by the Japanese experience of the atomic attacks. Godzilla awakes in the original film as the result of hydrogen bomb tests in the Pacific. Despite its grave delivery, the Spike promo is completely removed from the terrible reality of what actually occurred in August, 1945.

The death toll from both bombings has been estimated at well over one hundred times the casualties from the 9/11 attacks. This figure includes an estimated sixty-six thousand to one hundred forty thousand instant deaths in Hiroshima and an estimated forty thousand in Nagasaki. We know that in the immediate five years following, a quarter of a million more died, with untold hundreds of thousands more dying in the decades after the bombings from radiation-related diseases. But statistics remove us from the human factor of disaster, and the horror of Hiroshima and Nagasaki is beyond human imagination.

Hiroshima has been called “the exclamation point of the twentieth century”, but two perspectives from survivors are more than enough to tattoo the pictures forever in one’s brain. Stephanie Cooke tells of one in her recent book, "In Mortal Hands: A Cautionary History of the Nuclear Age": “…a nineteen-year-old girl who survived reported a remarkable sight near a public garden. Amid the bodily remains, burned black and immobilized at the moment of impact, there was, she said, ‘a charred body of a woman standing frozen in a running posture with one leg lifted and her baby tightly clutched in her arms.’”

In "Uranium: War, Energy and the Rock That Shaped the World", Tom Zoellner describes how Japanese writer, Yoko Ota, remembers the white flash as “the collapse of the earth which it was said would take place at the end of the world.”

Zoellner continues, “Even President Truman, who was famously coolheaded about the decision to use the weapon on Japan, wondered in his diary if the act he would soon authorize was ‘the fire destruction prophesied in the Euphrates Valley Era, after Noah and his fabulous ark.’"

When news of the successful atom bombing of Hiroshima reached the team of scientists behind its invention in Los Alamos, New Mexico, “there was a general excitement, and scientists rushed to book tables at Santa Fe’s best restaurant to celebrate the achievement. But that night’s party on the mesa was a grim affair. Almost nobody danced, and people sat in quiet conversation, discussing the damage reports on the other side of the world. When J. Robert Oppenheimer left the party, he saw one of his colleagues—cold sober—vomiting in the bushes.”

The decision to drop the atom bomb is a controversy that will remain unsettled and is examined at length by Richard Rhodes in his books about the nuclear age. One school of thought is that the Japanese doctrine of "defense at all costs" was a bluff; another indicates that they had already expressed a willingness to negotiate a cease-fire through Russian back channels. According to the Russians, the atom bomb was secondary, and it was Moscow’s declaration of war against Japan that was the deciding factor in ending the war.

In her illuminating book, "Troubled Apologies: Among Japan, Korea, and the United States", Alexis Dudden describes both US media censorship and outright fabrication about the bombing of Japan as propelling “the basic story line for Hiroshima and Nagasaki that Americans would come to cling to as history at the cost of learning what was actually going on: ‘the bombs saved lives.’…the US government and its officially placed mouthpiece at the New York Times established as a fact that no one in Hiroshima had died from radiation and that only foreign lies (British or Japanese) suggested otherwise.”

The New York Times’ science writer who was an eyewitness over Nagasaki, William “Atomic Bill” Laurence, won a Pulitzer Prize for his early, evangelical coverage of atomic weapons. His account of the event demonstrates that he was not only distant from the event by mere altitude, but close to some kind of atomic rapture:

“Being close to it and watching it as it was being fashioned into a living thing so exquisitely shaped that any sculptor would be proud to have created it, one felt oneself in the presence of the supernatural…Awe-struck, we watched it shoot upward like a meteor coming from earth instead of from outer space, becoming ever more alive as it climbed skyward through the white clouds. It was no longer smoke, or dust, or even a cloud of fire. It was a living thing, a new species of being, born right before our incredulous eyes.”

Historically, not everyone was sold. Harvard chemist George B. Kistiakowsky witnessed the Trinity test in July 1945, only several weeks before the atomic bombing raid on Japan, and called it “the nearest thing to Doomsday that one could possibly imagine. I am sure that at the end of the world—in the last millisecond—the last man will see what we have just seen.”

John Hersey was the first to write of the human factor in his long August 1946 New Yorker essay profiling regular people on the ground in Hiroshima. The day after the Trinity test, sixty-eight scientists at the University of Chicago signed a confidential letter to Harry Truman urging him not to use the device. They wrote presciently: “If after the war a situation is allowed to develop in the world which permits rival powers to be in uncontrolled possession of this new means of destruction, the cities of the United States as well as the cities of other nations will be in continuous danger of sudden annihilation.”

Leo Szilard, the scientist who persuaded his colleagues to write the letter, and the man who conceived of the chain reaction and worked on the Manhattan Project, later referred to himself and other atomic scientists as “mass murderers.”

“Why did they have to go and drop another?” a wife of one of the atomic scientists asked upon hearing the news of Nagasaki. “The first one would have finished the war off.” Short of an apology, this kind of self-reflection on the part of civilians as well as the scientists—including Einstein—who were behind the creation of the Atomic Era leads one to wonder what we can do to make amends today.

There is a long tradition—if not ritual—of apology in Asian cultures. It is one that seems to have been adopted for some time by Americans, who are now accustomed to press conference scenes where morally straying politicians apologize to the nation, their constituents, wives, and families for errant behavior. More recently, other kinds of less predictable apologies have appeared.

Last February, the Senate apologized to Native Americans for atrocities committed during the opening and seizing of their lands. On July 29, the US House of Representatives issued a resolution formally apologizing to black Americans for slavery, nearly a century and a half after its abolition. After forty years of silence, at a local Columbus, Georgia, Kiwanis Club on August 21, Lt. William Calley (the only Army officer convicted for the 1968 My Lai massacre), in an extraordinary and unexpected apology, expressed his “remorse for the Vietnamese who were killed, for their families, for the American soldiers involved and their families. I am very sorry.”

Twenty years ago, Congress apologized for the World War II internment of Japanese Americans in concentration camps. Why not Hiroshima? None other than the author of "Giving: How Each of Us Can Change the World" said the following in 1995: “The United States owes no apology to Japan for having dropped the atomic bombs on Hiroshima and Nagasaki.” Bill Clinton was just toeing a bipartisan line.

Alexis Dudden traveled with President George W. Bush on a trip to Tokyo during 2002 on a mission, among other things, to thank the Japanese government for its support of the War on Terror and to launch plans to celebrate the 150-year anniversary of Japanese-American relations. Dudden relates that she “had an unexpected, theatrical education in one of the trajectories of Hiroshima’s history during (a) routine walk.” As she passed down the main boulevard near the Japanese Parliament and National Library, “several of the notorious black trucks popular with the country’s extreme right wing passed…with the lead van blaring the customary martial songs. This was not unusual, but the message pouring from the loud speakers stopped me flat—‘Welcome to Japan, President Bush of the United States of America! Apologize for Hiroshima and enjoy your stay!’”

She goes on to say, “Throughout the recent era of apologies all around—or maybe in spite of it—there has remained one matter on which Washington holds firm, regardless of who is in office—there will be no apology for Hiroshima or Nagasaki.”

Originally, the lowest projection of how many American casualties would be avoided by using the bomb rather than mounting a costly land invasion of Japan was twenty-six thousand. Dudden observes, “Americans transferred what happened—the destruction of Hiroshima and Nagasaki—for an event that never took place—the proposed land invasion of Japan—to stand in for history. By the early 1950s, the imagined truth was American myth, and in 1959, President Truman wrote for the record that the bombs spared 'half a million' American lives, and that he 'never lost any sleep over the decision.' Over the years…American storytelling has come to count the number of ‘saved’ Americans as high as 1 million. (This number appeared squarely in David McCullough’s 1993 Pulitzer Prize winning biography, Truman, despite abundant evidence to the contrary at the time.)”

Apologies are more complicated than they appear. Dudden’s book details the way that formal apologies can be used to cloak a deeper strategy to avoid restitution and financial penalties. As to the US government’s obdurate stance, Dudden concludes, “The chronic inability to confront how America’s use of nuclear weapons against Japanese people in 1945 might constitute the kind of history for which survivors would seek an apology, let alone why the use of such weapons might represent a crime against humanity, is sustained by Washington’s determination to maintain these weapons as the once and future legitimate tools of the national arsenal. It is not at all by chance that among weapons of mass destruction—nuclear, chemical, and biological—only nuclear weapons are not prohibited by international law. Were it otherwise, the likelihood that the history of America’s use of them on Japan would generate charges of attempted genocide against the United States or Harry Truman would increase exponentially.”

Last April, President Obama made strong statements during a visit to Prague about his commitment to abolish nuclear weapons. His speech called for an international summit on the subject by the end of the year. The Mayor of Hiroshima, Tadatoshi Akiba, said, “As the only nuclear power to have used a nuclear weapon, the United States has a moral responsibility to act,” defining U.S. responsibility in a historical context. Akiba asked Obama to hold the summit in Hiroshima.

Our family had a Japanese exchange student staying with us for a year. After she had been living with us for a while, I felt compelled to speak with her about her hometown of Hiroshima and to apologize in my own way for what had happened before either of us had been born. She seemed surprised at my gesture and we spoke about the event in the abstract—her parents had been children at the time and spoke little to her, if at all, about their memories. I suppose that the American generation preceding mine who experienced the direct consequences of World War II might argue with my stance on apology citing my distance from the events that defined them, in many cases, for the rest of their lives.

I find it interesting that the words “apology” and “apocalypse” share the same prefix. Apology is said to be rooted in words originally meaning “regret, defense, or justification” and giving an account or story of oneself. Apocalypse is rooted in the Latin word meaning “revelation” and the Ancient Greek meaning “to uncover,” as in to lift a veil. The prefix “apo” means “from, away, off.” Perhaps there is a connection between the act of apologizing and the avoiding of apocalypse—by this logic, if we lift the veil that hides our own truth, then revelation might follow. A year after the bombing of Hiroshima and Nagasaki, the esteemed Indian yogi, Paramahansa Yogananda, reflected on the discovery of uranium: “The human mind can and must liberate within itself energies greater than those within stones and metals, lest the material atomic giant, newly unleashed, turn on the world in mindless destruction.”

With the first new ruling party now established in Japan in over fifty years, an appropriate overture to the new government from the American President whose campaign mantra was “change” should be to agree to hold the Nuclear Non-Proliferation Treaty Review Conference in Hiroshima and take the world stage by opening his remarks with an apology to the people of Japan. Think about it the next time you are being served sushi—these people were our enemies? Or as Allen Ginsberg might say, “We are the Japan.”

Friday, June 12, 2009

IT'S LIKE, TRULY AWESOME, DUDE!


Maybe we should give the word “awesome” some rest. I mean, is there anyone else out there who thinks that the word “awesome” is becoming a tad overused? This was all brought into high relief during a recent visit to Starbucks. My tendering of exact change was met by a hearty “awesome!” from the barista. Shortly after, I heard about the first Twitter message from space—one of the astronauts simply messaged, “Launch was awesome!” Well, those two uses of this exhausted exclamation are literally miles apart, and I think the gap is probably quite telling.

Afterwards, I happened to be speaking with Peter Merlin, who is the base archivist at Edwards Air Force Base and works for NASA. I mentioned the launch tweet and said, “Now, you guys really do know the meaning of ‘awesome’—take those Hubble images peering back to the edge of time some 13.4 billion years and almost to the Big Bang—now that’s a truly awesome feat, dude.” He’d just seen the space shuttle take off from Edwards piggybacked on top of its tricked-out 747 ride, and for anyone who’s ever seen the shuttle take off or land—well, you know that it is an experience that dwarfs one’s self—and is truly an awesome sight to behold.

How did a surfer’s exclamation like “awesome” become today’s unequivocal, number one superlative? This is a word that was traditionally reserved for approximating the transcendental—historically, its use is far from common because it has generally been reserved for describing the indescribable: the mystical, psychedelic experiences, seeing into other worlds, the streaking of a UFO over Roswell, New Mexico, the vast impact crater of a meteor near Winslow, Arizona, the majestic heights of the Himalayas, perhaps the mile-deep expanse of the Grand Canyon, or merely the jaw-dropping sight of contemplating infinite space in the sprawl of stars in a clear night sky.

No matter, the adaptation and morphing of words into popular culture provides a fascinating window into the evolution of language, class structure, and the evolution or devolution of consciousness. Let’s not take it out only on poor “awesome.” There are some other offenders that are equally annoying—and revealing. Take, for example, the word “like,” another vastly overused, overwrought word that now has multiple uses beyond its original sense, some of which no longer mean anything at all. That’s where meaning gets really interesting, for my money.

Casual eavesdropping at a high school, local mall or watching any reality or celebrity talk show will reveal the word “like” abounding, crashing into consciousness in wave upon wave onto the shore of one's mind until it makes you submit to white noise like a jargonaut jingle—“It’s just like, well it’s sort of like unbelievable, you know—I’m like, well, he said to me that he’s not like, in love with me—like, not at all! And I’m like, if it’s not like, ‘love’, then is it like, a deep feeling, at least? Like it really hurts, you know. Like, oh, whatever!”

How did a word that once charmed lower-class Elizabethan theatergoers when used discreetly by the likes of William Shakespeare become the province of the stoned-out ’50s beatnik, the watered-down, safe-for-’60s-TV hippie of the Maynard G. Krebs character in "The Many Loves of Dobie Gillis", or the ’70s version, Shaggy on "Scooby-Doo"?

In this latter usage, it has evolved into a kind of verbal placeholder to mark time while the speaker scrambles to find the words and articulate a response. No one is saying here that natural speech is the same as prepared remarks—we don’t expect our daily conversations to demonstrate the same stilted flow as those based on handy teleprompters and speechwriters. As such, our everyday dialogues are filled with placeholders of one kind or another—“um” is probably the most common example in the US and UK, and one that is profiled in the wonderful book "Um. . .: Slips, Stumbles, and Verbal Blunders, and What They Mean" by Michael Erard.

Prior use of the word “like” (before it was appropriated by today’s teenagers) was traditionally as a preposition or as a figure of speech. In this use, it is still known as a “simile” and seeks to compare two unlike things in the reader’s or listener’s mind. Shakespeare used one when he lamented of the dead Julius Caesar, “Why, man, he doth bestride the narrow world like a Colossus.”

Perhaps what we are seeing in its new usage is the sad result of the modern mind becoming less capable of holding two separate ideas simultaneously. I’ve written before about the losses that can be associated with technology and its influence on behavior such as personalization.

Multi-tasking, much ballyhooed as an emergent asset of so-called millennials and other techno-savvy youth, also must have a downside if it is just a downsizing of the attention span of the user. And it’s not just restricted to exuberant digerati. Studies of cell phone use among drivers have shown that accidents are far more likely when activities are combined—in the case of men, they are 40% more likely to have an accident, and women 60% more likely to have a mishap—one of the reasons that many states now have laws about texting and handheld calling while driving.

Shortly after beginning to write this post, I bumped into Barry Sanders, author of the classic popular study of literacy and media, "A Is for Ox: The Collapse of Literacy and the Rise of Violence in an Electronic Age". His book provides an in-depth look at what we are losing as a society as we struggle to extract ourselves from the La Brea Tar Pits of daily media immersion. I told Barry that I’d recently become obsessed with the “awesome” phenomenon and was busy exorcizing myself through a blog post. He cheerily replied, “Oh, ‘awesome’ is a word from my century—the 14th!”

Consideration of the word’s origin in the distant past does shed some light on how far we’ve come in its evolution. Actually, the first recorded use is probably before 1300 in “Arthour and Merlin,” which developed from an even earlier appearance around 1250 in “The Story of Genesis and Exodus.” Apparently, it had been borrowed from the Scandinavian aghe or the Old Icelandic agi, both meaning “fear.” Interestingly, the Greek achos is similar and means “distress and pain.” Discovering its etymology, I began to lessen my angst by hearing of its reference to pain. By Barry’s century, the Middle English word had spawned “aweful,” meaning “to inspire fear and terror,” and by 1598 we see the first use of “awesome.” I was finally feeling like I was getting home.

But in 1997, I left my home in Hollywood and started working for a Bay Area startup. It was then that I was introduced to another word I hadn’t heard since watching Edd “Kookie” Byrnes on “77 Sunset Strip”—the word was “cool”. Again, it’s another superlative, and like so many slang terms and phrases that jump into youth culture—superlatives like “dope”, “tight”, “phat”—it comes primarily from African American jazz diction, where it originally described a genre as well as a state of artistry that audiences tried to mimic. "Juba To Jive", a dictionary of African American slang, indicates that it’s been in use from 1650 or so, derived originally from Mandingo and entering popular usage in England as early as the 1590s. By the 1930s, American gangsters and tramps were using it to refer to killing someone, or to someone who was "a stiff". It also came to refer to a person who displayed “great self control”.

As the opposite of “hot”, it grew in the 1940s as an adjective for someone who was aloof, detached, unflappable, and unflustered, but also for a hip cat or chick who was as fashionable as the cool jazz, martini soundtrack of the times—as in “Cool, Daddy-o!” During the hippie era, it came to mean “I don’t have any dope on me, officer!”—as in “I’m cool”—or in the way that the Dennis Hopper character tries to reassure Martin Sheen’s character in "Apocalypse Now" that Col. Kurtz (as portrayed by Marlon Brando) is “cool”, despite the fact that his jungle camp is festooned willy-nilly with decapitated bodies and chopped-off heads.

Perhaps more telling than common superlatives is another word in common currency, particularly in teenage vocabulary—“whatever”. A kind of contemporary version of what The Sex Pistols once described in the chorus of “God Save The Queen” as “No future”, it takes the placeholder status of “like” one step further and can be seen as the ultimate dodge of having an opinion one way or the other. Like the color black that is favored by punks and inspired by the same boutique as The Sex Pistols, it is an expression of no color or opinion. What we may be bearing witness to is a media-saturated generation with expanded options and information overload served in snack-size bytes, but no real power of choice.

In a fascinating piece by media writer Paul Parton called "The Consumer: Adjusting To Internet Time", which appeared in Mediapost in 2007, he points us to what may be the crux of the matter: “Now, with the Web, there’s often no lag time between stimulus and response…an entirely new dynamic with significant implications for the way we create marketing communications and build brands…It’s advertising’s ‘butterfly effect’.”

The point is that previously, a short attention span was a bad thing for building connections or creating a “cultural bond” between brands and consumers. Now, with Amazon's one-click, instantaneous purchasing feature, impulse buying—long the province of convenience-store counters and infomercials—is the industry standard. So, if advertising traditionally relied on what Robert Heath in "The Hidden Power of Advertising" called “implicit” or long-term memory, where brand messages work over time in the subconscious—is advertising now dead? That may be a subject for debate, or at least a future post, but what is important here is this: if the loop is closed so tightly in time, what becomes of the decision process when there is no lag between stimulus and response?

What you get is not only “cool”, but another phrase I first heard in Silicon Valley—“cool stuff”. What exactly is “stuff” anyway? It’s in common use as a sort of come-on—as in “and other cool stuff!” As such, it sounds somewhat similar to “whatever”—just more things of some nondescript description. “Stuff” originally meant the kind of quilted material that was placed underneath chain mail in the Middle Ages—hence its later slang use in describing someone as a “stuffed shirt”.

In 2005, a number of results were published by scientists from experiments showing that human beings are still evolving. In particular, genetic research has led to conclusions that Homo sapiens’ brains have added or “selected” versions of genes over time that may have influenced cognition, and therefore changed our capability for making what Terence McKenna once called “organized mouth noises”—language.

But, as Christine Kenneally says in “The First Word: The Search for the Origins of Language,”: “Not all change is good. As much as language enables us to control nature and keep our environments stable, it also makes possible the dramatic altering of our environment in unexpected and dangerous ways. The same language skills that promote technological innovations like water irrigation, road building, and air conditioning also produce the ozone destroying pollution and countless other ecological dangers of the modern age. Any of these phenomena could result in a sharp left turn for the human genome. And perhaps the same linguistic skills that give us science and currently some control over DNA, will lead to our own extinction in less obvious ways. Language and material culture have greatly increased the mobility of the world’s population, and some researchers believe that this will lead to an unhealthy and irreversible diminishing of variation in our genome. As more and more humans breed across the boundaries of genetic variation, we become a blander, more homogenous bunch than our diverse parent groups. This could be a problem…for the more we are the same, the easier it is for one single thing to make us extinct.”

Perhaps we are beginning to see this homogenization factor in language. Is understanding a brand logo the same as identifying with a clan symbol or totem? What is missing in the former is a connection to personal as opposed to corporate history. The latter fulfills the need for story in the myths of tribes and cultures. Maybe that’s one of the reasons that marketers have been speaking for the last several years about a brand telling a story and emphasizing narrative in testimonials and association with celebrities whose endorsements come with their own attendant PR mythologies.

We live in dangerous times not only for our environment, but for language, too—when text becomes a verb, as in “texting”, and now “twitter”. The June 11 edition of Daily Finance just announced that “Twitter Breaks the Verb Barrier”:

“It may be reducing us all to a bunch of semi-literate, hashtag-spewing teenagers, but Twitter's influence on the English language can't be ignored, in the view of the Associated Press. The microblogging service has reached a new milestone, earning a promotion from noun to verb in the new edition of the AP Stylebook.

That means AP writers (and others who observe the news collective's style guidelines) can now, without shame or censure, use the phrase "to Twitter" in place of the wordier "to post a Twitter update." Tweet, the preferred term for a Twitter post, also works as a verb, per AP. The timing is appropriate, coming a day after the Global Language Monitor declared "Web 2.0" the millionth word to enter the English language.

Think having your brand name recognized as a verb isn't a big deal? Tell that to Microsoft, which chose the name "Bing" for its revamped search engine in part because it thinks the moniker will "verb up" in the manner of Google.”

Is complexity of language a sign of evolution or devolution? Latin, a highly complex tongue, has long been considered a “dead language”. Many languages are going extinct, as discussed in an earlier post. With some six thousand languages remaining on Earth, about half of the world’s population speaks only ten of them, with English the most dominant one.

According to Kenneally, every two weeks the world loses another language. She observes, “When a language dies, we lose the knowledge that was encoded in it. Though we assume that when knowledge is lost, it has been superseded by a superior version, a dead language, with all its unique ways of carving up the world, is as irreplaceable as the dodo or Tyrannosaurus rex. Unfortunately, even if we, and our languages, are still evolving, we still don’t know where we’re heading.”

Freeman Dyson, the retired professor of physics at the Institute for Advanced Study in Princeton, is more optimistic: “When teenagers become as fluent in the language of genomes as they are today in the language of blogs, they will be designing and growing all kinds of works for fun and profit.” Now, that's a scary notion which gets me right back to the original meaning of “awesome”—what a terror-inspiring concept!

If we no longer know the origins of words that we are using continuously, almost automatically, is something already extinct in us? And do words lose their new meaning if they are beaten to death? If everything is “awesome”, then what words can we use to describe the truly awesome? But, things could be really bad—I mean, we could end up like the Hopi people who have no tenses because they have no word for the concept of "time". Does having a really short attention span lead to time travel or living in the moment of instant gratification?

Seriously though, the Hopi were onto something truly awesome in what the linguist Benjamin Whorf called “Hopi Time,” as are the Inuit people, who famously have scores of words to describe what we civilized folks shrink-wrap into “snow”. Who are the primitives now? Are we downsizing not only our vocabularies, but our consciousness, by shrink-wrapping language for convenience’s sake? Where is the expansive, future hybrid lexicon of the youth tribes in "A Clockwork Orange" when you need it? Maybe a reference to this tribal linguistic superiority can motivate us to dig down into our own war chest and find some new superlatives, or perhaps we can just go back to using the word “great” or some other expression of exclamation. Now that would be truly awesome, dude...

Wednesday, May 27, 2009

HOW TO THINK INSIDE THE PYRAMID REDUX


For those readers who have been following the continuing, late-breaking story from the Old Kingdom of Ancient Egypt, nearly 5000 years ago, I was recently graced and flattered by an email from the man who solved the great mystery of how the Great Pyramid was built:

Hi Kevin,

I'm Jean-Pierre Houdin and I'm very pleased with what you wrote about my theory...
HOW TO THINK INSIDE THE PYRAMID
Thank you...

I don't know if you had the opportunity to watch the NatGeo USA documentary about my work?

You should watch it:

http://www.youtube.com/view_play_list?p=3442C0E0D8EA2A33

Or you have the BBC2 Timewatch version:

http://www.youtube.com/view_play_list?p=0E083435887644B5&search_query=pyramid+houdin

A French documentary was also edited last year and was broadcast in many European countries.

The Japanese television NHK will broadcast their own documentary in Japan in the coming months.

I've received hundreds and hundreds of e-mails from all over the world, all very positives and most of them with these remarks I picked up from your blog:

"The jury may still be out in terms of how traditional Egyptologists have reacted to Houdin's theory, but to me, the idea makes logical, if not just plain common sense".
.../...

"The logic of Jean-Pierre’s theory is transparent and struck me as a breakthrough. It just made sense".

Egyptians were rationalists...The way they built the large smooth pyramids of the 4th Dynasty...makes sense (their "know-how")...
And they were as smart as we pretend we are...45 centuries ago...
The question : "How the pyramids were built" is our problem, not their...They built the pyramids...Period.
But since 200 years, all the guys willing to explain the construction started from a unique wrong idea: OUTSIDE...
Their answers are wrong from line one because the base of the studies is wrong...
How can you explain something when you start wrong?

I didn't invented anything, I just understood HOW THEY BUILT THE PYRAMIDS...I'm an architect...and that helps...a little...

The guys who deserve something are our Egyptian Ancestors...

What impressive work they did...

You should have a look at:

www.3ds.com/khufu

and

www.construire-la-grande-pyramide.fr

If you want more information, feel free to ask.

Best regards

Jean-Pierre Houdin

Sunday, May 24, 2009

MY FIRST RECORD


Do you remember the first record you ever bought? Well, I guess that I’m already dating myself here by referencing vinyl—in this case, 45 rpm recordings. But, I was given solace recently on a field trip with my son to a restored Victorian home nearby. We were touring with a couple of other families with children about his age who were around 8 or 9, when we entered what in the 1890s was called the “salon” and what we know today as the living room. Inside, the importance of music and conversation were in clear evidence with numerous chairs, settees, a large couch, a Steinway square grand piano that had made it by boat from New York and around the tip of South America, and an Edison Wax Cylinder Phonograph.

When the docent started explaining how this device was used, one of the kids who was inspecting its parts rather intensely asked with a shrug, “Where does the CD go?” It made me smile, but also feel better, because he hadn’t asked where you would click to get downloads. The technology and instruments that humans have developed to capture sound—and its organized form that we know as "music"—may change, but the manner in which we first hear and understand it is matched in importance only by the musical entity who introduces it to us. And in a way, that primary experience can say a lot about us as individuals, as well as set a trajectory for our musical futures. The extra step that we take when we actually consume music as purchasers—whether on vinyl or in a digital format—may also serve as a sort of musical version of carbon dating, since music is distinguished as an art that lives by and in time.

In my case, my first acquisition as a consumer came at the age of seven: Chubby Checker’s “The Twist”, released in 1960. I don’t remember where I bought it or what led me to buy it in the first place—though, when in doubt about my early music history, I usually blame an appearance on “The Ed Sullivan Show”. But, I still remember that it was on Cameo Parkway Records, and that the red and black label had an actual gold-lined cameo image of a refined lady in profile on it. This visual element is one that I grew to associate with music—and 45 labels were nothing once I graduated to albums, which arguably had already become an art form—if often a kitschy one—in their own right during the 1950s.

MTV added motion video to music, which to my mind often held the artist and audience hostage to a music video director and record label marketing department’s “vision” and “interpretation”. In the world of the download, we’ve now gone full circle. Once upon a time, it was the packaging that made opening a new record “album” like Christmas every time you went to the record store—even though you couldn't always judge a record by its cover. Now that has all but disappeared. More important, packaging was not only a marketing come-on, but also influenced music discovery.

I remember haunting my local record store as a teenager and seeing certain album covers that lured—and even frightened—me into buying them. When a friend told me in 1967 that there was this band from England that was louder than The Mothers of Invention, I went and asked for “Are You Experienced”. When I saw the cover, adorned by a leering trio splashed in psychedelic finery, beckoning out of a fisheye lens with a look that dared me to enter—I had to think twice, but am forever glad that I didn’t hesitate too long. The first several bars of the opening song actually flicked a navigational switch on in my brain that has been setting a course for the heart of the sun ever since. It was also a record that was to rear its surrealistic head in a book I did with the rock critic Dave Marsh, with whom I collaborated on the long out-of-print The Book of Rock Lists, published by Dell and Rolling Stone in 1981.

Part of the book included a year-by-year breakdown called “Top of the Pops”, which codified our own version of the Top Forty Hits in Rock and Roll from 1955 to 1979 and which we described as follows: “For the authors, one of the great incentives in a project like The Book of Rock Lists is the opportunity to inflict on the unsuspecting reader personal opinions about the greatest and most essential records of all time.”

Another series of lists set about codifying the greatest “Top 40 Chartmakers”, the forty albums from each year (beginning in 1963) that made Billboard’s Top 100 Chart. There were many conversations about ranking these records. But, when we came to the seminal year of 1967, we had a lot to consider with The Beatles’ “Sgt. Pepper’s Lonely Hearts Club Band”, The Mothers of Invention’s “Absolutely Free”, Otis Redding’s “Live In Europe”, “Fresh Cream”, The Doors’ debut album…and “Are You Experienced?” by The Jimi Hendrix Experience. Ultimately, Jimi won out, though in retrospect, “Sgt. Pepper” would seem the logical winner over time as classic rock music. No matter—it was all meant to provoke friendly debate, just as Dave and I had experienced in its creation. And rock and roll doesn’t suffer academic treatment very well. I always cringe when I see it offered on some over-reaching college syllabus. It seems like the last nail in its coffin (see my earlier post, The Vampire Theory of Rock and Roll) to stuff Rock Music like some taxidermy object to gaze and wonder at, if not dissect for hidden meaning.

Another record that I saw on and off for months at my local record shop was Dr. John’s first record, “Gris Gris”. I was actually scared by the cover, which dripped with Voodoo talismans and trimmings and an image of a somewhat diabolical-looking madman in shocks of red and green, like some New Orleans Halloween hallucination. Somehow, I knew that if I bought that record, my ears would never be the same. When I finally put my money down and listened to the spooky likes of “Croker Courtbullion” and “I Walk On Gilded Splinters”, this wasn’t just music—it was theater of the mind. It introduced me not only to mojo roots, but to Mac Rebennack’s musical roots, and opened up a lexicon from Huey “Piano” Smith and Duke Ellington to the weird, ethnographic swamp soup of Voodoo chants and Afro/Yoruba trance dance. Again, I’d never heard anything like it, and my brain was imprinted with coordinates for future navigation to the Mississippi Delta and points east across the Atlantic and beyond to the so-called genre of “World Music.” It’s a journey I’ve been on ever since.

The other thing that Dr. John’s album came with was liner notes, a subspecies of the album as an art (or non-art) form that has now all but disappeared with virtual music consumption. I’ve delighted in showing my daughter the liner notes from Bob Dylan’s many early albums, written in his e.e. cummings mirror style of lower-case, West 4th Street stream of consciousness. Dr. John’s liner notes were also written in a voice that echoed and added detail to the musical phantasmagoria within.

Liner notes were long established in the world of classical music and jazz, where the “seriousness” of the exercise no doubt inspired the necessity of anatomical dissection and explication. But, rock and roll was a latecomer—I mean, what can you possibly dissect about “The Who Sell Out”, “Never Mind the Bollocks, Here’s The Sex Pistols”, or even “Sgt. Pepper” for that matter—but as the music developed a history and became more popular and recognized, the addition of liner notes made more sense, depending on whether a musician could actually write or an eager rock critic was available. A rare few, like “Freak Out!”, provided a bonus map of an artist’s musical DNA. By citing his artistic influences at some length, Frank Zappa added to my future discoveries, and not all of them were restricted to music.

How do digital music consumers discover new sounds today? The retail store has gone the way of the dinosaurs, with the large chains going under from lack of relevance but, thankfully, hearty last-of-the-independents like Amoeba Records flourishing as beacons in the wilderness. Downloads and ringtones have now overtaken the brick-and-mortar market. According to Techcrunch, in 2006 music downloads were increasing at a pace of over 50% a year, while CD sales declined that year by 20%. More recent stats would certainly reflect this trend.

Collaborative filters like the iTunes Genius feature are only as good as artificial intelligence can be in making associations between individual personal taste and similarities of potential interest. Peer-to-peer sharing of music is still a huge factor, even in the post-Napster universe, with LimeWire and others still booming. Sharing lists of favorites on social media networks allows another view into personal taste that speaks to music as first and foremost a community of specialized interests. Music may have actually been the impulse behind the first human communities, when their members invariably gathered around a campfire on the African savannah to sing for the hunt to go well and rain to abound—but that’s another story. The offering of samples on services like CDNow and Amazon is likewise helpful, but all of the above tactics still miss some of the mystery for me that exists when you enter a place like Amoeba in LA.

Usually, I am looking for something specific, like a digitized version of an old record—yesterday, for example, I was searching out a copy of The Rolling Stones’ “Their Satanic Majesties Request”—their characteristically dark answer to “Sgt. Pepper”. But, what I usually come out with is anything but what I originally thought I’d be buying. My friend Jeff Elmassian, a brilliant composer and virtuoso in his own right and CEO of Endless Noise, a premier music design firm for commercials, spoke of an interesting experience while taking his teenage daughters on a pilgrimage to Amoeba.

On finding a certain record she was looking for, one of them told her father that she wanted only one song on the album and didn’t want to buy the whole thing in order to enjoy it. I remember that feeling many times myself, when I had to fork over the dough for an entire album in order to claim the one song I liked. Not all albums were created equal, and quite often the hit single was a teaser, the loss leader for an album that disappointed. We’ve come a long way into the universe of the singular download and the shuffle-mode mentality.

Singles were another method of music discovery back in the day, when they were often pre-releases for albums by new artists as well as established ones like The Beatles, who would lead off with a taste of what was to come. Sometimes, singles had added value when they didn’t appear on a follow-up record, or appeared only on a record several years later. The world of digital downloads has put the model of releasing singles on steroids—but now the consumer can choose not to buy an entire “album”, and very often there isn’t even a long-form version to follow. My daughter was telling me last week about a new band whose “album” of four tracks she really liked. I crankily responded that we called a record with so few songs on it an “EP” in my day, and that it didn’t really qualify for the designation of “album” at all.

I forgot how polarizing and magnetic music is until a recent post elicited a great response of emails and comments, for which I am grateful. One such comment came from Kevin Henry, who inspired this present post. He described buying his first single, The Beatles’ “I Want To Hold Your Hand”—“a simple song at best and not earth-shattering by any means”, as he describes it—but one with the inherent power to make him remember “clearly my father yelling to turn that crap off.” He goes on to say: “Today, when I look in the mirror, I wonder who that old guy is and I always sing a little to myself…’hope I die before I get old’...feeling a little sorry for myself and then a magical thing happened the other morning...my 17 year old daughter picked up my iPod by accident on her way out the door and when she walked in that evening she said with a smile...’who are these guys...this stuff is incredible’...and at that moment a connection took place between us as I told her the story of my youth and realized that the revolution lives on.”

I’ve had similar cross-generational experiences with my teenage daughter, who has embraced a lot of the music I grew up with, some of it out of curiosity, some out of enforced listening, and some organically out of her own path of discovery. It’s inspired conversations over the years with younger co-workers at various places I've worked about how great it must have been to experience the 60s, and whether “my music is better than your music.” I never quite got that line of attack. If, as Kiki Dee once sang, "I've got the music in me," then what we don’t like may simply be music that hasn’t yet connected to where it plays a harmony inside us.

To me, it’s all a continuum, as Kevin Henry’s anecdote above reflects so well. But, our first records put a stake in the ground, a tent pole like a clef that affixes music in our memory as the soundtrack to our lives, setting up thematic mileposts made of sound. They have a way of intersecting our life stories at critical points, where music can speak to us as if it were written just for us. Certain records entered my life in this way almost as if they were chapter titles—“Meet The Beatles”, “Absolutely Free”, “Muddy Waters: The Real Folk Blues”, John Coltrane’s “Giant Steps”, Miles Davis’ “Kind of Blue”, several classical music albums…and the list goes on.

It’s interesting to me that my very first record was a dance song. I had no idea at the time what a cover record was, or that Chubby Checker was having success with a number that was originally written and recorded by Hank Ballard. I also had no real idea what sex was at the age of seven—“to make (beautiful) music with someone”, meaning to “have sexual intercourse”, is cited by the Online Etymology Dictionary as arriving on the scene in that “seminal” year of 1967. The more recent euphemism, “The Mystery Dance”, may be the more useful expression here.

But, on a primeval level, I guess we're all genetically wired to understand sound as rhythm first, whether it’s the pulse of our blood that steps up with excitement of different kinds, the rhythm of language before we know what words mean, the different kinds of beats in the cries a baby makes depending on her hunger, pain, or want of company, the consuming, inspirational sounds of the natural world, the clickety-clack made by toy trains, the delight of tapping out rhythms with a pencil on our school desk to annoy the teacher—or, as quantum mechanics has verified and as the Vedantas and the mantra tradition have known for thousands of years—it’s all vibration, man.

Bassist Victor L. Wooten describes it succinctly in his book, The Music Lesson: A Spiritual Search for Growth Through Music: “A-440 means that a note vibrates four hundred and forty times per second…if you keep cutting that number in half, 440, 220, 110, 55, etc., you will eventually get beats per minute. At that point, it’s called rhythm.” The oldest musical instrument documented in the archeological record may be a bone flute from the Upper Paleolithic, but my money would bet that percussion was the original featured instrument of our furry, low-browed ancestors. Click sticks like those used by the Aborigines in Australia most likely have forty or fifty thousand years of use. Banging on so-called “ring rocks”, or hitting stones against each other, seems like another natural move.
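Wooten’s halving exercise is easy to check numerically. Here is a minimal sketch in Python; the 8 Hz cutoff for where a pitch dissolves into countable beats is my own illustrative assumption, not a figure from the book:

freq = 440.0                # A-440: cycles per second (Hz)
while freq >= 8.0:          # keep cutting the number in half
    freq /= 2.0
print(freq, "Hz =", freq * 60, "beats per minute")
# 440 -> 220 -> 110 -> 55 -> 27.5 -> 13.75 -> 6.875 Hz, i.e. 412.5 BPM;
# one more halving gives about 3.44 Hz, a brisk but playable 206 BPM.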

What is music and where does it come from? Wooten refers to its origin as a word composed of “Mu”, an ancient term for “mother”, and “sic”, which he takes to be an abbreviation of “science”. Traditional etymology would cite the word’s origin as a tribute to the Greek goddesses known as the Muses, who are known to have served up a variety of artistic elements for humans to play with. Regardless of its meaning, music is unique in existing in both space (in memory and physical vibration) and time. Its very existence points us to what Dizzy Gillespie so eloquently described as the “place between the notes”, as memorialized in John Cage’s famous piece, “4’33”. It is the place where we literally catch our breath, our heartbeat, and where music is created out of the void, out of the great expansive silence, out of that Big Bang of Original Compressed Sound where the first note of song reverberated the original vibration as the Music of the Spheres, the frequency that we all carry with us regardless of our preferred musical tastes. Or as the great classical composer of the 20th century, maestro Frank Zappa, once said, “Music is the Best”.

I am very interested in readers sharing stories of how their first records impacted their lives and welcome all submissions to the comment section below.

Sunday, May 3, 2009

HOW TO THINK INSIDE THE PYRAMID



When you are sitting inside the King’s Chamber in the Great Pyramid, many thoughts come to you. The sweep of Ancient Egypt and its mysteries is still very present despite a distance of thousands of years. It is also literally quite close: just outside the Pyramid flows the great Nile, the world’s longest river and arguably still its most mysterious. Inside, as everywhere in Egypt, the sacred meets the profane—the odor of cold limestone is mixed with the faint acrid smell of urine, whether from bats or humans one cannot be sure.

The King’s Chamber lies at the heart of the Great Pyramid and is actually a rather small room, roughly 17 by 34 feet—still, it is daunting in its structure, with its massive lintel ceiling 19 feet overhead. Electric lighting now diminishes the mystery somewhat, several vertical lights framing the empty, lidless sarcophagus carved out of solid granite, chipped away on one side from years of souvenir hunting as well as from the original intruders who probably used force to open it in the hope of retrieving any of Pharaoh Khufu’s mummified remains. More than anything, the Egyptian impulse is about monumentality, and the Great Pyramid is a testament to that impulse written, characteristically, in architectural form.

From the moment you see the Great Pyramid, you are entering a world of epic stone. You are also faced with another key feature of the Ancient Egyptian mind—the Egyptians ascribed ultimate importance to the way that mathematics and what has come to be known as "sacred geometry" informed original and ongoing creation—and true to form, it’s all a numbers game with the Pyramid as well. The monument is made up of two million limestone blocks averaging two-and-a-half tons and three feet high, with some granite blocks (like those in the ceiling of the King’s Chamber) weighing between 30 and 60 tons each. Experts have estimated that it took 25,000 workers some two decades to build it, with tons of stone transported both from local quarries and from ones as far as 500 miles away.
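Those numbers are worth pausing over. A minimal back-of-the-envelope sketch (the ten-hour working day is my own assumption for illustration, not a figure from the sources above) shows the pace the builders would have had to sustain:

```python
# Pace implied by the commonly cited figures: roughly two million blocks
# placed over some twenty years of construction. The ten-hour working day
# is an illustrative assumption only.

blocks = 2_000_000
years = 20
hours_per_day = 10

blocks_per_day = blocks / (years * 365)
minutes_per_block = (hours_per_day * 60) / blocks_per_day

print(f"{blocks_per_day:.0f} blocks per day")          # about 274 blocks a day
print(f"one block every {minutes_per_block:.1f} min")  # about one every 2.2 minutes
```

One multi-ton block set in place every couple of minutes, every working day, for two decades: whatever the method, the logistics alone are staggering.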

Certain parts of the structure seem to defy logic and even gravity—the so-called Grand Gallery, which leads you up to the King’s Chamber, is one. It truly lives up to its name—you enter it from a passageway about three-and-a-half feet high, where you have to duck, into a sprawling expanse 157 feet long and 28 feet high. Even as it opens up, what gets to you is not so much relief from claustrophobia as wondering about the stone mass that surrounds you. It seems natural, if not a survival instinct, to consider how this immense weight is distributed and what is holding it all up. It is somehow reassuring that it has apparently done so without shifting since it was constructed.

Despite any misgivings, the overall sense one has is simply wonder, along with burning questions about its purpose and how it was actually built. Most amazing, perhaps, is not how it was built but that it was built at all—and some 4,500 years ago. Hollywood movies have memorialized one of the theories with the familiar scenes of thousands of slaves pulling massive blocks under the cruel lashes of overseers’ whips and the monomaniacal eyes of the Pharaoh looking out over the Gizeh plateau at the ramp extending from the river to scale the emerging manmade mountain. The truth appears far less dramatic. It is clear from recent discoveries, by Mark Lehner in particular, of the village where the workers lived that they were not slaves but were well treated and fed, though accommodations were certainly barracks style, without amenities.

A second theory proposes that the ramp did not lead up to the structure as it was built, but rather wrapped around it like a snake until the apex was finally reached and the capstone laid. For decades, these were the only theories besides those that call for alien intervention and levitation. A recent theory has set tradition on its head and has something to teach us about how to think “outside of the box” by considering the inside of the Pyramid.

A new book that is one of the first to actually merit its familiar-sounding subtitle, “The Secret of the Great Pyramid: How One Man’s Obsession Led to the Solution of Ancient Egypt’s Greatest Mystery” by Bob Brier and Jean-Pierre Houdin, describes a French architect’s search for an answer to how it was built. Houdin’s interest in solving the mystery was inspired by his engineer father, who, after watching a documentary in 1999 about the construction of the Pyramid, told him that the show’s presentation was all wrong. His idea of how the stones were raised to the top was novel if not revolutionary.

With a PhD in engineering from Paris’s prestigious Ecole des Arts et Metiers, Henri pushed his son to create sophisticated 3-D models of the conventional theories to see if they held water, so to speak. His work easily discredited the single-ramp theory. In order to deliver the stones to the rising Pyramid, the ramp would have had to be extended over time as the courses of blocks rose. The basic problem is that the gentle slope necessary for workers to haul the blocks would have required the ramp to extend to over a mile long. In other words, “if the Pyramid were being built on the site of New York’s Empire State Building, the ramp would extend all the way into Central Park, about twenty-five city blocks.” Such a ramp would have taken a separate body of thousands of workers many years to construct. It would also have produced a tremendous amount of debris, which has never been accounted for in any nearby rubble heaps. Finally, the topography of the plateau simply does not lend itself to the creation of such a ramp. It’s too small an area.
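The mile figure is easy to sanity-check. Here is a minimal sketch using the Pyramid’s original height of about 481 feet; the 8 percent hauling grade is an assumption often used in such reconstructions, not a number taken from Brier and Houdin:

```python
# Back-of-the-envelope length of a single straight ramp reaching the top.
# Height is the Great Pyramid's original height; the grade is assumed.

pyramid_height_ft = 481      # original height of the Great Pyramid
max_grade = 0.08             # assumed gentle slope for hauling sledges

ramp_length_ft = pyramid_height_ft / max_grade
print(f"{ramp_length_ft:,.0f} ft, or {ramp_length_ft / 5280:.1f} miles")
# -> roughly 6,000 ft, a bit over a mile
```

That is in line with the twenty-five-block comparison above, and every new course of blocks would have meant lengthening and raising that ramp yet again.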

The second theory, of a ramp that corkscrewed around the ascending Pyramid as it was built, did not fare any better. The fatal flaw, it turns out, was that the Pyramid “has four corners, and as the Pyramid grew, the architects had to constantly sight along those corners to make sure the edges were straight and thus ensure that they would meet at a perfect point at the top. But a ramp corkscrewing up the outside would have obscured these sight lines.” So it would seem impossible for the Ancient Egyptians to have raised millions of blocks using a stone road that wound up the growing sides of the Pharaoh’s mountain.

Jean-Pierre Houdin spent years computer-modeling how the building of the Pyramid progressed over the decades and was able to support his father’s theory through his findings. Interestingly, his father steered Jean-Pierre away from looking at the outside of the Pyramid for the answer. After years of research, Jean-Pierre proposed that the mile-long ramp corkscrewing to the top was to be found inside the Great Pyramid. In other words, it was built from the inside out. Subsequent research and scientific surveys on site have been favorable and are outlined in detail in the book.

Take, for example, just one aspect of its construction, the Great Pyramid's fabled outer layer. The Pyramid was once covered with flat "facing stones" that gave it a smooth, milky-white, shining veneer. It was said that at one time the Pyramid shone hundreds of miles out into the desert like a great beacon. Only the Pyramid of Chephren (one of the other pyramids that make up the fabled trio at Gizeh) still has remains of its outer layer if you look toward its top. Unfortunately, the prized outer stones from all three Gizeh pyramids were mostly removed and repurposed at various points in history in the construction of the expanding metropolis of Cairo--including its Great Mosque, where some of these original facing stones can be seen today.

Yet if an outer ramp had been used to lay these precious, smooth-faced stones, wouldn't the process have damaged their surfaces? If, however, the inside-ramp theory is valid, then it would have made far more sense to lay the outer stones first and build inward from them, laying down the inner blocks, shafts, passageways, and two main chambers. The jury may still be out in terms of how traditional Egyptologists have reacted to Houdin's theory, but to me the idea makes logical, if not just plain common, sense.

Sometimes we overcomplicate our search for answers by being too influenced by tradition—not only in terms of so-called conventional wisdom or intellectual inheritance, but in terms of our sensory bias. More often than not, I find that the art of the strategist is laying out the obvious, or what makes common sense, when a client has lost his way scaling his own mountain of business objectives. The requirements of building a business can immerse the insider in details that distract from and sometimes obscure the original essence of why it was created in the first place. Many times, the answer to a business problem is staring us right in the face and is not a matter of creating some nifty theory, body of evidence, and supporting tactics, but of relying on our gut and on what at first may seem illogical in the face of history or accepted facts.

In looking for our own answers—whether in business or in life—we may have to use less finesse and more brute force in our thinking. We may have to be more like the Arab intruder Al-Mamoun, who in 820 AD found the original entrance on the Great Pyramid’s north side sealed from within and set about with his men carving out his own entrance. It’s not a pretty sight today, but Al-Mamoun burrowed until he hit one of the monument’s passages and was in like Flynn. The logic of Jean-Pierre’s theory is transparent and struck me as a breakthrough. It just made sense. So, next time you are trying to “think outside of the box,” maybe it would help to first think about turning it inside out.

There is a reason that the word “Pyramid” is popularly traced to the Ancient Greek “pyra” and “mesos,” literally meaning “fire in the middle.” Maybe the name itself is a clue for us to find that creative fire, that so-called “spark” which lights when we discover our own center, to borrow from the Zuni people. Perhaps the Ancient Greeks and Egyptians were able to identify this as a place in space and time where all the stones of being are connected to the infinite horizon described by the “original mound,” which in turn inspired the Pyramid’s divine form. Or as the wondrous English fabulist Jeanette Winterson said, "Stones are always true. It's the facts that mislead."

Saturday, April 25, 2009

THE CAVE MIND OPERATING SYSTEM


Notorious, fabled “Beat” writer and author of “Naked Lunch,” William S. Burroughs, once defined “paranoia” as “just the state of having all the facts.” Now maybe I’m suffering a little bit from being overwhelmed by facts, but the smiley face has always made me suspicious that it can’t be all that good. At first glance, evolutionary biology may not seem the most likely refuge of the paranoid, but in the case of the smiley face, it’s brought me nothing less than religion.

The next time your better half, best friend, boss, helpful salesperson, or gleaming white-toothed celebrity smiles at you, think on this—according to evolutionary biology, the origin of the smile is the reflex that predators make when baring their teeth at the sight of prospective food. Clearly, there is something we can learn from considering our animal ancestry, and in particular, a lot it can teach us about behaviors that we either take for granted, assume we know all about, or don’t question at all. It doesn’t require lifting the veil of time and scrying into the mists of history—it only takes a glimpse at the new gods of sex, drugs, and rock and roll to recognize that we are creatures of biology, first and foremost. Maybe it’s time to use this fact to our advantage once again, given that there are predators like religious fanatics, evil bankers, and credit-card and loan sharks on the loose.

I’ve always believed that there’s a lot we can learn from the Upper Paleolithic, a time period when many of our ancestors were retreating from the ice and snow into the solace of fire-lit caves. “What can we learn from The Flintstones?” you might ask, besides the fact that all animated shows of yesteryear will at one time or another suffer from being turned into live-action features as Hollywood studios trawl the depths of television for recycling purposes. Consider also a trend that Faith Popcorn described in her 1991 book, “The Popcorn Report,” which she labeled “cocooning,” whereby Yuppies were seen as retreating into the new cave of their media-centric homes as a way to find relief from the modern rat race. There’s a reason that the root of the word “hearth” is easily found by dropping its final letter “h.” The fireside was once the “heart” of the home and may be again in the form of the postmodern, Green kitchen, if Kevin Henry is right in his latest post on his blog, “The Connected Kitchen”.

My bias is that art usually holds the key to human consciousness at any given time in history, and looking at so-called “Prehistoric Art” probably holds the veritable Keys to the Kingdom. Take, for example, the 1879 discovery of the famous cave at Altamira in Spain, which has been called “the Sistine Chapel of Paleolithic Art.” One summer day, a Spanish nobleman and amateur archeologist named Don Marcelino de Sautuola was joined by his young daughter, Maria, in a cave on his estate which he had explored for artifacts many times before. Called by John E. Pfeiffer in his book, "The Creative Explosion: An Inquiry Into the Origins of Art and Religion," “one of the great tales in the annals of prehistory,” this episode has something to teach us, almost like an Upper Paleolithic OS, about the power of common sense and of seeing through to the obvious.

Now the stuff of legend, his daughter (whose age varies according to the particular account, from five or seven to twelve) had wandered into a small side chamber that was three-to-five feet high in most places. Don Marcelino had traversed it numerous times without noticing what made his daughter cry out loud, “Toros, toros, toros!” Interestingly, in his search for stone artifacts he was always scouring the floor of the cave and had never actually looked up at the ceiling. As Pfeiffer describes it, “Nothing had prepared Sautuola for the shock of such a discovery. He had explored the chamber and thought he knew what was in it.” While he had used his lantern only to avoid being bumped on the head by the protuberances that were covered with vivid paintings in black, red, pinks, and browns, it was by that same lantern light that his child made the discovery simply by looking up. Little did she realize that in doing so, and in revealing the hidden prehistoric art, she would turn her father into an advocate tied to evolutionary theory, maligned to his grave as a crank and charlatan by the then protectionist, doubting world of traditional archeology.

His courage in facing harsh criticism that dismissed the cave paintings as forgeries is an inspiration, and his story provides us with OS Principle Number One from the Upper Paleolithic:

1. SOMETIMES A BUMP ON THE HEAD IS A GOOD THING

In other words, always Look Up in addition to staring at your feet! This is also known as OVER, UNDER, SIDEWAYS, DOWN, or the Yardbirds’ Principle, after their 1966 hit.

Many of the famous caves in Western Europe from the Upper Paleolithic were discovered by children. This includes the most celebrated one of all, France’s Lascaux Cave, discovered in 1940 by four youths chasing a pet dog named Robot, who had disappeared into a hole in the ground that turned out to lead to the great subterranean galleries below. Some caves are even named after their youthful discoverers, like “Les Trois Freres,” after the three young brothers who first crawled its lengths.

As Pfeiffer says about Maria, the discoverer of the Altamira cave, “…she was too young to have acquired a bias against looking up rather than looking down.” He continues that her father “…had no real interest in the walls or ceiling of the cave. He was an excavator interested above all in what he could find at his feet, on the floor, such things as flint artifacts and bones and remains of hearths. The low ceiling of the side chamber was only a hazard to him, something to avoid.” The point is that life is, at the very least, three-dimensional, and we need to see ourselves both inside and outside the box in order to be creative and truly “think outside the box”.

This leads us inevitably to OS Principle Number Two from the Upper Paleolithic:

2. WHEN IN DOUBT, ASK A CHILD

When in doubt, don’t let age or experience be a factor. I remember once when my daughter was four and she asked me, “Daddy, why does infinity never stop?” For the first time as a parent, I had the survival instinct to ask her instead of trying to come up with any sort of reasonable answer. “What do you think, honey?” I asked her. Without missing a beat she replied, “Because they ran out of numbers!” You might be astounded by the insights offered by the unbiased eyes of the culturally agnostic and the brains of young souls who are closer to the tabula rasa.

Pfeiffer says, “Archeological records include many cases of art overlooked. The eye never comes innocent to its subject. Everything seen is a blend of what actually exists out there, the “real” object, and the viewer’s expectations, upbringing, and current state of mind. It is amazing what you can miss when you do not expect to see anything or, given a strong enough motive, what you can see that is not there. Unless the mind is properly adjusted or set, anticipating a revelation of a particular sort, nothing happens.”

Principle three, therefore, follows this theme of perception:

3. YOU CAN’T BELIEVE EVERYTHING YOU SEE AND HEAR, CAN YOU?

Otherwise known as the “Up From The Skies” Principle after the lyrics from Jimi Hendrix. Or better yet, it could be called “Anticipate Revelation.”

Why do the Aboriginal people of Australia believe that our world is the dream and that the true world is the Dreamtime beyond our consensus reality? With 40,000 to 50,000 years of experience to draw upon, one has to ask the question. Like the San people of South Africa, the Aborigines are one of the only cultures that still have an ongoing tradition of painting caves.

You don’t have to get tribal to appreciate the Other Side of the Sky. There is a story about visionary English poet, William Blake, that is a case in point. Upon hearing a knock on the door, his wife once answered the caller’s inquiry as to whether Mr. Blake was at home, by responding: “No. He spends most of his time in heaven.”

Shamanic cultures tend not to throw out anything that works. In other words, if you are bent on survival, why dispose of the practical? This is just one factor that supports the efficacy of shamanism as an alternative medical practice, as well as a way of seeing that there are many more worlds than ordinary “9 to 5” reality. Chief Seattle took this to its logical conclusion when he said, “There is no death, only a change of worlds.”

Like quantum physicists, cave dwellers and modern tribal peoples believe that the stone walls of caves are more like membranes between this world and that of the ancestors. So placing a painting of one’s own handprint on top of an ancestor’s creates a link through which one can touch and pass into a kind of historical continuum, the ancestral chain of being. Drawing an animal is believed to have been an appeal on the part of hunters, asking permission of their quarry’s spirit prior to hunting it for food.

The representations of animals in the Upper Paleolithic caves are so realistic that they seem to breathe, especially in torchlight and placed as they often are on outcrops that enhance their shape—the artists were obviously very familiar, at close range, with their subjects, and their depictions are in many cases without peer in the millennia that have transpired since. No less a figure than Picasso testified to this when, after seeing the extinct Altamira bison painted some 15,000 years before, he remarked: “None of us could paint like that.”

This raises the question of how art enters the picture, which brings us to principle number four:

4. WHAT IS WORK TO ONE CAVEMAN IS ANOTHER MAN’S ART

In his illuminating book on cognitive archeology, “Shamanism and the Ancient Mind”, James L. Pearson says: “From the first discovery of prehistoric painting at Altamira to the stunning finds at Grotte Cosquer and Chauvet Cave in the 1990’s, researchers have tried to uncover the meaning of this Ice Age art and the function of the painted caves.” The field of study that undertakes to explore the caves and other sites associated with such decoration is called “Rock Art,” a label that, while helpful for academics, presents some semantic problems when looked at with the tribal eye.

The basic issue is not only how to define art—a challenge we’ll leave to the experts for now—but according to Steven Mithen in “The Prehistory of the Mind: The Cognitive Origins of Art and Science”: “…the definition of art is culturally specific. Indeed many societies who create splendid rock paintings do not have a word for art in their language.”

Is “rock art” art if its producers didn’t think so? For most of the 20th century, prevailing wisdom associated cave art with hunting magic. Other scholars and researchers like Mircea Eliade, Joan Halifax, Weston La Barre, Andreas Lommel, and David Whitley suggested that Lascaux, Les Trois Freres, and other rock art sites depicted shamans and supernatural helpers.

World-renowned authorities Jean Clottes and David Lewis-Williams expanded on a preceding neuropsychological model and combined it with ethnography in their 1998 book “The Shamans of Prehistory: Trance and Magic in the Painted Caves”: “The way in which each individual cave was structured and decorated was a unique result of the interaction of four elements: the topography of the cave, its passages, and chambers; the universal functioning of the human nervous system and in particular, how it behaves in altered states; the social conditions, cosmologies, and religious beliefs of the different times at which a cave was used; and lastly, the catalyst—the ways in which individual people and groups of people exploited and manipulated all of these elements for their own purposes.”

The most fascinating connection made in the neuropsychological model is between the actual symbolic elements in rock art and the phosphene action that takes place in the human eye, whether during altered or natural states. Just rub your eyes and you’ll see these shapes and signs, created by the firing of the optic nerve. Many are universal forms like zigzags, dots, rakes, and spirals that appear throughout rock art sites across millennia and all over the world. The question then becomes: how much of what we see is conscious and how much is not?

In this view, what is considered representational art has a connection to the ability to create symbols with intention, and it turns creative expression into an index of the level of consciousness of a specific culture at a particular time and place. Cognitive archeology holds that the production of representational art requires a certain brain capacity that sees outside itself. When I’ve taken tribal people from different cultures to see the rock art sites in the local Santa Monica Mountains, they are always careful to offer interpretations circumscribed by their own culture. “To us,” they start with a disclaimer, “these paintings represent clan symbols.” But they are always deferential about the meaning, intent, or purpose of the tribe who created them. This perspective leads to our next principle:

5. SOMETIMES A CIGAR IS JUST A CIGAR

As Freud famously said. The bottom line, in terms of my own experience at rock art sites, is that you can’t dismiss the possibility that some of the paintings and petroglyphs were just doodling, a sort of tribal version of a “Kilroy Was Here” message. Maybe it was just a fine day around the water hole, where hunter-gatherers had the luxury of some extra time on their hands and thought to memorialize their afternoon with their mark. So we have to consider that some of the “art” may not have been conceived of as representational or symbolic at all, but as either functional—as with hunting magic—or as doodles that were pleasing to the eye and meant nothing more. But one of the manias of our scientific age is the attempt to find a rational way to explain everything.

One of the difficulties in rock art research is that there is no Rosetta Stone handy to decipher pictographs and petroglyphs. Outside of cultures with living traditions of rock art like the Aborigines and the San people, it is not straightforward to interpret what they mean. Instead, we are often left with the beautiful problem of confronting meaning ourselves as a primary experience, without interpretation—with nothing between us and the original maker of the markings—a rare occurrence that we should treasure in this media-immersive world, which interprets our experience to death for us in over three thousand advertisements, logos, and consumer messages a day.

So what may be art to us, at a historical distance from the circumstances and cultural context in which cave paintings were created, may have had quite a functional purpose for those who originally produced it, whether it was to evoke the ancestors, supernatural or animal powers, or clan territory. My take is that even though the scientific method was born out of the Age of Reason and out of a rejection of religious belief, it is still based in part on fear of the Unknown. The search for meaning is one way to moderate fear, leading naturally to our next precept:

6. DON’T BE AFRAID OF THE DARK

Sometimes the light at the end of the tunnel is not a train. One of the most impressive things about caves is the kind of absolute darkness that we ordinarily never experience. To enter one, you often have to deal with fear. On some primordial level, you feel as if you are leaving the lighted world, to say nothing of carrying along the cultural baggage of the collective unconscious associated with the netherworld.

Imagine what it must have been like to descend into one of these places as a twelve-year-old initiate in Upper Paleolithic society, led by the most frightening person in the tribe—the shaman—making your way by hook and by crook, on your hands and knees, through the mud and underground streams, listening to the drip-drip-drip of water seeping from the land above mixed with the strange sounds of nether-dwelling life forms, and suddenly seeing forms of animals and other strange shapes come alive with the lighting of the shaman’s lamp. It probably was an experience that would give religion to any one of us.

A recent book by Martin Lindstrom called “Buyology: Truth and Lies About Why We Buy” features excellent material relating to the use of fMRI technology, but one finding, in particular, is quite surprising. The results of fMRI scans have demonstrated a connection between religion and consumer behavior. Apparently, experiments showed that the part of the brain activated during religious ceremonies and experiences is the same as the region which is active during shopping, watching commercials, and gazing at corporate logos.

Ad Age reported on April 6 about findings from the New York Buyology Symposium that presented brain scanning data making correlations between “cult-like brands” such as Harley Davidson and Ferrari and the emotional drivers associated with believers in the world’s largest religion, Christianity.

Dr. Gregory Berns is a psychiatrist who is also a leading authority on neuroeconomics and biomedical engineering. Neuroeconomics is a field that combines neuroscience, psychology, and economics to understand how individuals and groups make decisions, take risks, and experience rewards. One of the primary tools researchers use is fMRI (functional magnetic resonance imaging), which studies brain response by correlating activity in specific regions of the brain with the decision-making process.

Berns’ new book, "Iconoclast: A Neuroscientist Reveals How To Think Differently," reveals the limitations that the fear response places on creativity and innovation. In an extended interview in the current edition of Super Consciousness magazine, he says, “The importance of the distinctions in how each of us sees the world cannot be underestimated…perception is not something that is immutably hardwired into the brain.” He profiles recent findings that show how we are capable of transforming the way we perceive life and can redirect neurological firing. It’s not an easy feat, requiring extraordinary mental training and energy, but the idea stands as one of the fundamental principles of neuroeconomics.

According to Berns: “…one of the brain’s primary survival mechanisms is conserving energy. The brain does this by limiting energy expenditure during normal everyday awareness…for most people, though, breaking out of the comfort zone of their energy conservative perceptions is often a fearful proposition.”

He goes on to say that fear limits our ability to be creative and is a huge impediment to innovation. Similar to Malcolm Gladwell’s recent treatment in “Outliers,” Berns sees the great innovators as outsiders and iconoclasts who are able to face risk, “but cognitively reframe (such situations) so as to estimate some kind of likelihood of success or failure to make a decision.” He calls it “the optimism bias,” which allows them to “downplay negative scenarios” rather than buy into the uncertainty or ambiguity that is at the root of primal fear. It’s interesting that structural ambiguity is a feature of much video game design, as described, for example, by Jim Gasperini in “Structural Ambiguity: An Emerging Interactive Aesthetic” in “Information Design,” edited by Robert Jacobson.

Economists call one kind of uncertainty “risk”: there is a possibility of success or failure, but you can estimate the odds and assign some likelihood to the outcome. A bet on a fair coin is risk; a bet on an urn whose contents you have never seen is ambiguity. Neuroscience indicates that the fear response is generated when we don’t have a complete picture, a state of ambiguity. The current financial crisis has inspired fear, according to Berns, because we don’t have all the facts about how deep it is and how far it’s going to go.
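To put the distinction in the simplest possible terms, here is a minimal sketch; the payoffs and odds are purely illustrative, not drawn from Berns:

```python
# Risk vs. ambiguity in miniature: with risk the odds are known, so an
# expected value can be computed; with ambiguity they are not, and the
# calculation simply cannot be done. Figures are illustrative only.
from typing import Optional

def expected_value(payoff: float, probability: Optional[float]) -> Optional[float]:
    """Return payoff * probability, or None when the odds are unknown."""
    if probability is None:       # ambiguity: no odds to work with
        return None
    return payoff * probability   # risk: odds known, value computable

print(expected_value(100.0, 0.5))   # risk -> 50.0
print(expected_value(100.0, None))  # ambiguity -> None; this is where fear lives
```

The fear response, in this framing, attaches to the case where the calculation has nothing to return.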

In a May Atlantic article about the financial crisis, Cody Lundin says, “Risk-taking went over the edge. We are inventing something new. We’re very afraid. We know from the Depression that people who lived through it didn’t change their mentality for the rest of their lives. They were sewing socks. They refused to take a lot of chances. My sense is that it will take 10 or 20 years to find that spark of risk-taking in people again.”

The way that we approach risk is at the basis of strategy. One of the things that our hunter-gatherer ancestors learned from animals is low risk behavior. Berns describes it as, “…head in the sand, everyone in the bunker, cut back spending, hoard what I have, and wait for the storm to pass. That is a very instinctual response and again, goes back to the survival instinct. When you are afraid, you tend to retreat and hoard what you have. Animals that have the capacity to think through the situation just wait it out. That is a low risk strategy and will probably work to maintain your status quo, which is fine if that is what you want. The innovator sees everyone else doing that, and it is precisely in those circumstances that it makes the most sense for them to take risks.”

Fear is, therefore, not the optimal operating system. In “The Science of Fear,” Daniel Gardner demonstrates how many irrational fears are based on the ways that humans miscalculate risks. To be creative, to innovate, and ultimately to succeed, we need to transcend the fear of the cave of the mind. In one of his notebooks, Leonardo Da Vinci wrote: “Drawn by my eager wish, desirous of seeing the great confusion of the various strange forms created by ingenious nature, I wandered for some time among the shadowed cliffs and came to the entrance of a great cavern. I remained before it for a while stupefied and ignorant of the existence of such a thing. With my back bent and my left hand resting on my knee, and shading my eyes with my right, with lids lowered and closed, and often bending this way and that to see whether I could discern anything within. But this was denied me by the great darkness inside and after I stayed a while, there arose in me two things: fear and desire. Fear, because of the menacing dark cave, and desire to see whether there were any miraculous thing within.”

Gregory Cochran and Henry Harpending argue in their new book, “The 10,000 Year Explosion: How Civilization Accelerated Human Evolution,” that recent genetic change has been far more expansive than the traditional “great leap forward” that scientists believed defined human beings some 40,000 to 50,000 years ago. This was also the period that gave us the birth of artistic expression, with examples of the so-called Venus sculptures appearing 40,000 to 60,000 years ago and a flute dated at some 54,000 years. The actual beginnings of art are the subject of much debate, and estimates range up to 100,000 years ago. What is agreed on is that a creative explosion took place around 30,000 years ago, the date of the Chauvet Cave, and amazingly, already in full development. Whether Cochran and Harpending’s theory has validity or not, I still think that the invention of fire is pretty hard to top, with language and art a close second and third. The nature of images, whether art or otherwise, leads to our final principle:

7. THINK BEYOND WORDS

Maybe it all comes down to what Fred Barnard once said in 1921 when he coined the expression, “a picture is worth a thousand words.” He was speaking about the signs on the sides of streetcars. No matter: if the cave mind operating system has a purpose for us today, it’s because it drives us with mysterious images to think beyond words, to face our fears, and to find consciousness in the stars that light up our brains.