Saturday, September 27, 2014

Cookie-Cutter People

“In order to be irreplaceable, one must always be different.”
– Coco Chanel

The Town of Truro sent two of its paid flaneurs* to our house last week to appraise it for a revised tax assessment. While one of them sauntered through the rooms snapping photos, I mentioned to the other that our house might be difficult to evaluate because it is unique. He explained that house assessments are determined by an algorithm: plug in the metrics and a computer spits out the valuation you are taxed on.
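
What he described, in other words, is a mass-appraisal model: a formula, not a judgment. Here is a minimal sketch of what “plug in the metrics” might look like -- every metric name and coefficient below is hypothetical, purely for illustration:

    # A toy mass-appraisal formula. The metric names and coefficients are
    # invented for illustration; real assessment models are calibrated
    # against local sales data.
    def assess_value(square_feet, bedrooms, bathrooms, lot_acres, age_years):
        """Estimate a taxable valuation from a handful of standard metrics."""
        value = (50_000                     # hypothetical base value
                 + 120 * square_feet        # assumed price per square foot
                 + 8_000 * bedrooms
                 + 10_000 * bathrooms
                 + 25_000 * lot_acres)
        value *= max(0.6, 1 - 0.005 * age_years)  # simple depreciation, floored
        return round(value, -3)             # round to the nearest $1,000

    print(assess_value(square_feet=1_800, bedrooms=3,
                       bathrooms=2, lot_acres=0.5, age_years=40))

A genuinely unique house defeats a model like this: its distinguishing features simply have no input variable.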

“We don’t use the word ‘unique’ and you shouldn’t either,” he told me sotto voce. “Banks don’t like to finance ‘unique’ dwellings because of resale.”

And there you have one of the significant shortcomings of our society. Homogeneity.

We hear so much about the importance of being true to yourself, of thinking outside the box. But in practice, we’ve become cookie-cutter people: from chain restaurants serving up taste-alike food to chain clothing stores serving up look-alike fashions.

Did this ubiquitous and tedious conformity begin when we started telling every child in school that he or she was special? Did it start when no kid left the playing field a loser?

This from Nolan Bushnell, founder of Atari – and former boss and mentor of Steve Jobs:

“A lot of what is wrong with corporate America has to do with a culture filled with antibodies trained to expel anything different. HR departments often want cookie-cutter employees, which inevitably results in cookie-cutter solutions.”

In 1956, William H. Whyte published The Organization Man. It was regarded as a breakthrough sociological commentary and became a bestseller because it so courageously described what was happening on a mass scale to post-war American society: television, affordable cars, fast food. Families were nuclear, and “following your bliss” led to planned suburban communities like the 1950 California tract housing pictured above.

Whyte was alarmed by this phenomenon and he wanted us to be alarmed, too. The American belief in the perfectibility of society, he wrote, was shifting from one of individual initiative to one achieved at the expense of the individual:

“Once upon a time it was conventional for young men to view the group life of the big corporations as one of its principal disadvantages. Today, they see it as a positive boon.”

A few years later, Richard Yates published his first novel, Revolutionary Road, which was nominated for the 1962 National Book Award. It exposed the underbelly of Ward and June Cleaver’s TV family: the idealized model of life-long career, two-child family and sensible house in the suburbs.

In the 2008 movie adaptation, frustrated housewife April Wheeler tells her husband, Frank, in a pivotal scene:

“Our whole existence here is based on this great premise that we're special. That we're superior to the whole thing. But we're not. We're just like everyone else! We bought into the same, ridiculous delusion.”

Today we have “politically correct” thinking. The late British Prime Minister Margaret Thatcher used to call it “fashionable consensus.” And it just might be more insidious than Whyte’s “group life” … because it has morphed into “groupthink.”

Here’s April Wheeler again:

“Tell me the truth, Frank. Remember that? We used to live by it. And you know what's so good about the truth? Everyone knows what it is however long they've lived without it. No one forgets the truth, Frank, they just get better at lying.”

(*Flaneur: A man who saunters around observing society)


In my next blog, “A Life in Black and White”

Saturday, September 20, 2014

Once Upon a Time


I read my friend Jeffrey Alexander’s newest book, Obama Power, in one rainy Sunday afternoon on Cape Cod. It asserts that although Obama was written off by pundits as a one-term wonder following the Democratic congressional losses of 2010, he won re-election two years later -- by using story-telling techniques we’ve known since humans sat around the fire in caves.

With the State of the Union Address in 2011, writes Alexander, the Lillian Chavenson Saden Professor of Sociology at Yale, Obama created a fictional character and drew a plot line that ended in “This is what change looks like.”

Opponent Mitt Romney, on the other hand, had little difficulty putting points on the board, but he had problems narrating himself heroically. Prof. Alexander quotes Peggy Noonan, speechwriter for President George H. W. Bush: “Mr. Romney couldn’t articulate a way forward, and nobody knew what his presidency would look like.”

“It is story-telling, not policy,” concludes Prof. Alexander, “that defines presidential success.”

Because perception can define performance, story-telling has also become corporate America’s latest buzzword for everything from brand marketing to social media to employment resumes.

Persuasion is the core of commerce. Customers must be sold on a product, employees motivated to buy into a strategy, investors convinced to trust in a stock. But despite the critical importance of persuasion, most executives struggle to communicate, let alone to influence and inspire.

Robert McKee is an award-winning screenwriter and director. In a classic Harvard Business Review article some years ago, he was quoted as saying:

“There are two ways to persuade people. The first is by using conventional rhetoric, which is what most executives are trained in. It's an intellectual process, and in the business world it usually consists of a PowerPoint slide presentation in which you say, ‘Here is our company's biggest challenge, and here is what we need to do to prosper.’ And you build your case by giving statistics and facts and quotes from authorities. But there are two problems with rhetoric. First, the people you're talking to have their own set of authorities, statistics, and experiences. While you're trying to persuade them, they are arguing with you in their heads. Second, if you do succeed in persuading them, you've done so only on an intellectual basis. That's not good enough, because people are not inspired to act by reason alone.

“The other way to persuade people—and ultimately a much more powerful way—is by uniting an idea with an emotion. The best way to do that is by telling a compelling story. In a story, you not only weave a lot of information into the telling but you also arouse your listener's emotions and energy. If you can harness imagination and the principles of a well-told story, then you get people rising to their feet amid thunderous applause instead of yawning and ignoring you.”

Persuasion is the centerpiece not only of business activity, but also of much human endeavor. And we’ve struggled forever with how to do it.

As long ago as the fourth century B.C., for example, Aristotle was wondering what makes a speech persuasive. He came up with three principles: ethical appeal (ethos), emotional appeal (pathos) and logical appeal (logos). A rhetorician strong on all three, he said, was likely to produce a persuaded audience.

Replace the word rhetorician with politician … or executive … or brand … and Aristotle’s insights seem entirely modern.

In my next blog, “Cookie-Cutter People”

Saturday, September 13, 2014

Recess Rhythms


There is an elementary school about a half-mile from my house in Vieques, and when school is in session, I can hear the distant playground sounds of the children.

Ever notice that playground noise sounds the same, whether the kids are speaking Spanish, English or Swahili?

It brings to mind the sounds of my own childhood in Perth Amboy, New Jersey, in the Fifties. Especially the rhythms and rhymes of the girls playing jump rope during recess.

Back in those fifth-grade days, I attended Shull School, a big, classically designed building set on a hill. On either side of it were playgrounds, situated above sidewalk level: one for the boys and another for the girls, just as there were separate entry doors and staircases for each.

Now, each generation of little kids believes they are the first to think up novel ways to deceive teachers. In my case, I ran with a pack of little perverts who thought we were the first to notice that all the girls wore dresses or skirts, and that if we casually stood on the sidewalk below the girls’ playground, we could nonchalantly look up as the girls jumped rope -- and treat ourselves to a peek of gam or, if the gods were kind that day, a blur of underwear.

Skipping rope, for some reason, has always been done almost exclusively by girls. Maybe because girls are better than boys at displaying athletic poise while articulating memorized or spontaneous rhyming patterns. There might be two girls swinging the rope, for example, sometimes swinging two ropes simultaneously (as in the 1946 photo above), or even two girls in the middle, skipping in unison.

Over on the boys’ playground, meanwhile, we goonies just ran around chaotically or engaged in fistfights. 

Where did the tradition of skipping rope to the cadences of rhythmic rhymes come from? I haven’t found any definitive answer. Girls make them up, it seems, and teach them to one another and to younger girls.

Girls and boys have their own parallel cultures and spread stories and rhymes and bits of nonsense to one another, passing them down to younger children and forgetting them as they grow up. There's a whole world of creativity going on underneath our noses, of which we adults are largely unaware, despite having participated in it ourselves at one time.

The rhythms we hear during recess do have an effect, though, and shape our point of view.

One folklorist theorizes that some girls’ rhymes hint at fears of puberty and the consequences of sex – in masked language:

Cinderella, dressed in yellow,
Climbed the stairs
To kiss a fellow.
Kissed a snake
By mistake.
How many doctors
Will it take?

Some rhymes might be nothing more than a clever way of being naughty -- without rousing the ire of teachers and parents:

Miss Annie had a steamboat
The steamboat had a bell
Miss Annie went to Heaven
The steamboat went to
Hell-o
Operator
Give me number nine
If you disconnect me
I'll kick your fat
Behind
The 'frigerator
There was a piece of glass
Mary sat upon it
And broke her big fat
As-k
Me no more questions
I'll tell you no more lies
Tell that to your mother
The day before she dies

One positive aspect of rhyming has been demonstrated conclusively: familiarity with rhymes is a strong foundation for reading literacy. Studies confirm that the better children are at detecting rhymes, the quicker and more successful they will be at learning to read, despite any differences in class background, general intelligence or memory ability.

Shouldn’t publishers of children’s books know this kind of thing? So why did Dr. Seuss – the father of rhymed stories for children – suffer rejections by 27 publishers before his first book was printed?

In my next blog, “Once Upon a Time”

Saturday, September 6, 2014

Adam’s Curse


We are afflicted with what’s been called Adam’s curse -- awareness of our own mortality.

Unlike you and me, for example, dogs don’t reflect upon themselves or worry that their breath is bad. Their self-awareness is limited.

Living here in paradise, why am I entertaining these dark thoughts? Because I spent last weekend in Philadelphia, visiting my grandson, Connor, who’s studying biological sciences and psychology at Drexel University. Connor has developed a keen interest in neuroscience and psychology. He’d like to someday improve the way these two fields can be applied -- through research and clinical work -- to medicine and spirituality.

He gave me a book to read so that I might understand his ambition to unlock the secrets of the mind: Consciousness, by Christof Koch, a colleague of DNA discoverer Francis Crick. Now, thanks to Connor, I can’t stop wondering about a subject few of us ever think about.

In reading the book, I learned that our inner world of mind, soul and spirit is more a mystery than is the external universe. It comes down to one simple question: “How can something physical (brain matter) give rise to something nonphysical (feelings)?”

Think about this. Georges Lemaitre, who died in 1966, is the acknowledged "father of the Big Bang." A Belgian Catholic priest, Fr. Lemaitre, while still a junior lecturer at the Catholic University of Louvain, proposed an expansionary theory of the universe at odds with the prevailing belief that the universe had always existed in a steady state. He asserted that the entire universe began with what he called a "cosmic egg" or "primeval atom" -- a theory that Sir Fred Hoyle derisively dismissed as "the Big Bang." Fr. Lemaitre also argued that not only was the universe expanding, but the speed of its expansion was accelerating. To Sir Fred’s chagrin, the priest's theories have been substantively confirmed.

Yet, scientists and scholars still don’t know what our inner, mental world is made of -- much less understand why it exists at all.

In other words, astronomers can make statements with certainty about the Big Bang -- an event that took place 13.7 billion years ago. But we are baffled by the processes that make us aware of a toothache. Here’s Christof Koch:

“How the brain converts bioelectrical activity into subjective states, how photons reflected off water are magically transformed into the perception of an iridescent aquamarine mountain lake is a puzzle. The nature of the relationship between the nervous system and consciousness remains elusive and the subject of heated and interminable debates.”

The lack of any true scientific understanding of consciousness -- especially when you consider the feats of science in other fields -- leaves lots of questions. In fact, many philosophers, scientists and medical experts accept the possibility that consciousness may arise from a source beyond the physical.

The concept of free will, for instance, has baffled scientists throughout the centuries. How is it that we humans are able to bring ideas and actions into existence from nothingness? This defies the most basic physical law -- cause and effect.

Judy Bachrach, in her recently published book, Glimpsing Heaven: The Stories and Science of Life After Death, writes:

“This is an area where a lot more scientific research has to be done: that the brain is possibly, and I'm emphasizing the 'possibly,' not the only area of consciousness. That even when the brain is shut down, on certain occasions consciousness endures. One of the doctors I interviewed, a cardiologist in Holland, believes that consciousness may go on forever. So the postulate among some scientists is that the brain is not the only locus of thought.”

In a world where science has pretty much tossed out non-physical concepts from serious inquiry, paradoxes like this remain – the human capacity to “create through intention.”

Where do we go from here? Perhaps:
  • Continuing deep curiosity about the role of spirituality
  • A profound responsibility to hone for the better the “seemingly divine” tool of conscious awareness of ourselves and of our world
  • Final acceptance that we each wield wondrous power to manipulate the world in any way we please

Charles Duell was the commissioner of the U.S. Patent Office in 1899. He is most famous because of a quote attributed to him: "Everything that can be invented has been invented."

In hindsight, we realize that if Mr. Duell did in fact utter those words, he was an ignoramus.

The lesson for me is that, just when I’ve reached the advanced age when I start to actually buy into the idea that I have achieved some level of “wisdom,” along comes a grandchild – perhaps one I once taught to tie a shoelace – to teach me how little I really know … and how much more there is to learn.


In my next blog, “Recess Rhythms”