Tag Archives: The Wall Street Journal

Declare Your Independence: Those high heels are killers

Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes. 

I’ve revised this post in light of changes that have taken place during the past year and a couple of new ideas I want to pass along.

My newly revised post follows:

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body. 

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.” 

During the past two years, largely dominated by the global pandemic, many women and men adopted different ways to clothe themselves.  Sweatpants and other comfortable clothing became popular.  [Please see my post, “Two Words,” published July 15, 2020, focusing on pants with elastic waists.]

In particular, many women abandoned the wearing of high heels.  Staying close to home, wearing comfortable clothes, they saw no need to push their feet into high heels.  Venues requiring professional attire almost disappeared, and few women sought out occasions calling for any sort of fancy clothes or footwear.  

But as the pandemic began to loosen its grip, some women were tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing was noted in publications like The New York Times and The Wall Street Journal.   In a story in the Times, one woman “flicked the dust off her…high-heeled lavender pumps” that she’d put away for months and got ready to wear them to a birthday gathering.  According to the Times, some are seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay.  I get it.  “Dressing up” may be your thing after a couple of years relying on sweatpants.  But “towering heels”?  They may look beautiful, they may be alluring….

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention. 

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a trend toward higher and higher heels, and I found it troubling.  I was baffled by women, especially young women, who bought into the mindset that the dictates of fashion, and the need to look “sexy,” required them to wear extremely high heels.  

When I’d watch TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  I couldn’t help noticing the women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word), yet wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over. 

Sadly, this phenomenon has reappeared. On late-night TV, otherwise enlightened women are once again wearing absurdly high heels.

So…what about the women, like me, who adopted lower-heeled shoes instead?  I think we’ve been much smarter and much less likely to fall on our faces. One very smart woman who’s still a fashion icon: the late Hollywood film star Audrey Hepburn. Audrey dressed smartly, in both senses of the word.

I recently watched her 1963 smash film Charade for the eighth or tenth time. I especially noted how elegant she appeared in her Givenchy wardrobe and her–yes–low heels. Audrey was well known for wearing comfortable low heels in her private life as well as in her films. [Please see my blog post: https://susanjustwrites.com/2013/08/08/audrey-hepburn-and-me/]

In Charade, paired with Cary Grant, another ultra-classy human being, she’s seen running up and down countless stairs in Paris Metro stations and being chased by Grant through the streets of Paris. She couldn’t possibly have done all that frantic running in high heels!

Foot-care professionals have soundly supported my view.   According to the American Podiatric Medical Association, a heel more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress on the foot than a 1-inch heel.

A few years ago, the San Francisco Chronicle questioned a podiatrist and foot and ankle surgeon who practiced in Palo Alto (and assisted Nike’s running team).  He explained that above 1.5 inches, pressure on the ball of the foot increases and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He did not endorse wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel.  If you insist on wearing higher heels, you should limit how much time you spend in them.

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles along with higher-heeled numbers.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  At the same time, I wondered whether Nordstrom was reflecting what its shoppers had already told the stores’ decision-makers.  The almighty power of the purse—how shoppers were choosing to spend their money—probably played a big role.

The pandemic may have changed the dynamics of shoe-purchasing, at least at the beginning. For the first year, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.  Today, the pandemic may be a somewhat less frightening presence in our lives, and there are undoubtedly women who will decide to resurrect the high heels already in their closets.  They, and others, may be inspired to buy new ones.

I hope these women don’t act in haste.  Beyond the issue of comfort, let’s remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When they need to flee a dangerous situation, high heels can handicap their ability to escape.

How many needless deaths have resulted from hobbled feet?

Gen Z shoppers can provide a clue to the future. They largely eschew high heels, choosing glamorous sneakers instead–even with dressy prom dresses.

My own current faves: I wear black Skechers almost everywhere. I occasionally choose my old standby, Reeboks, for serious walking. [In my novel Red Diana, protagonist Karen Clark laces on her Reeboks for a lengthy jaunt, just as I do.] And when warm temperatures dominate, I wear walking sandals, like those sold by Clarks, Teva, and Ecco.

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I once again urge the women of America to declare their independence from high-heeled shoes. 

If you’re currently thinking about returning to painful footwear, think again.

I encourage you to bravely gather any high heels you’ve clung to during the pandemic and throw those shoes away.  At the very least, keep them out of sight in the back of your closet.  And don’t even think about buying new ones.  Shod yourself instead in shoes that allow you to walk in comfort—and if need be, to run.

Your wretched appendages, yearning to be free, will be forever grateful.

[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]


Hangin’ with Judge Hoffman

POST #8

This is the eighth in a series of posts that recall what it was like to serve as Judge Julius Hoffman’s law clerk from 1967 to 1969.

The “Chicago 7” Trial (continued)

            How did the Nixon victory lead to the trial of the “Chicago 7”?  The answer is simple.

             With prosecutions by the U.S. Justice Department shifting from the Johnson administration and its attorney general, Ramsey Clark, to Nixon’s team, things changed dramatically. 

            AG Clark had been reluctant to go after antiwar activists.  But Nixon was a warped personality, bent on punishing those he viewed as his enemies.  Once in office, with his own attorney general, John Mitchell, securely installed, he could prod federal prosecutors to go after his perceived foes.

            With the assistance of the FBI, long under the direction of another warped individual, J. Edgar Hoover, Nixon was able to track down his enemies, including antiwar protestors who had demonstrated against him.  At the Democratic convention in Chicago in August 1968, antiwar activists’ outspoken opposition to the ultimately successful nomination of Hubert Humphrey (who in their view had not supported their cause with sufficient enthusiasm) disrupted the convention and undermined Humphrey’s ability to defeat Nixon.  As I noted in Post #7, Humphrey’s popular vote total in November was only one percent short of Nixon’s.  But that one percent made all the difference in the now-notoriously-undemocratic Electoral College.

            Many of these protestors had opposed the Vietnam War even before 1968, and they promised to further disrupt things once Nixon was elected.  Hoover’s FBI moved on from targeting members of the Communist Party USA to targeting antiwar activists.  A covert program, Cointelpro, used a wide range of “dirty tricks,” including illegal wiretaps and planting false documents. 

I’ll add a recent update on Cointelpro here.

A fascinating revelation appeared in the San Francisco Chronicle in 2021

            On March 7 of this year, The San Francisco Chronicle revealed a break-in at an FBI office that underscores what the agency was doing at this time.  On March 8, 1971, Ralph Daniel, then 26, was one of eight antiwar activists who, having long suspected FBI malfeasance, broke into a small FBI office in Pennsylvania to seize records that would prove it.  (March 8 was chosen because, they hoped, FBI agents would be focused on the title fight between prizefighters Ali and Frazier that night.) The break-in was successful, and the records uncovered were leaked to journalists and others, exposing Hoover’s secret FBI program that investigated and spied on citizens engaged in protected speech. 

            This was the massive Cointelpro operation that had amassed files on antiwar activists, students, Black Panthers, and other Black citizens.  Fred Hampton, the leader of the Chicago Black Panthers, was one target of this operation. (He plays a small role in Aaron Sorkin’s film, “The Trial of the Chicago 7,” before his shocking murder is revealed during that trial.  I remember learning of Hampton’s murder and feeling sickened by the conduct of local law enforcement, whose homicidal wrongdoing later became apparent.)

            In 1975, the U.S. Senate’s Church Committee found the FBI program illegal and contrary to the Constitution.  Exposure of Cointelpro tarnished Hoover’s legacy and damaged the reputation of the FBI for years.

            Ralph Daniel, a resident of the Bay Area, revealed his story to a Chronicle reporter fifty years after the break-in took place.

The legal underpinnings of the trial of the “Chicago 7”

            With John Mitchell running Nixon’s Justice Department, federal prosecutors were instructed to focus on one section in a federal statute originally intended to penalize those who created civil unrest following the assassination of Martin Luther King Jr., and specifically to use that statute to bring charges against antiwar activists.  The statute, which had been enacted on April 11, 1968, was mostly a follow-up to the Civil Rights Act of 1964, and it applied to issues like fair housing and the civil rights of Native American tribes. 

            But Title X of this law, which became known as the Anti-Riot Act, did something quite different.  It made it a felony to cross state lines or make phone calls “to incite a riot; to organize, promote or participate in a riot; or to aid and abet any person performing these activities.” This provision, sometimes called the “H. Rap Brown Law,” was passed in response to the conduct of civil rights activist H. Rap Brown.  

How did Judge Hoffman become involved?

            In September 1968, shortly after the Chicago convention, the Chief Judge of the Northern District of Illinois, William J. Campbell, convened a grand jury to investigate possible charges against antiwar protestors who had been active during the convention.  The grand jury, which met 30 times over six months and heard about 300 witnesses, indicted the eight antiwar protestors who came to be dubbed the “Chicago 8,” charging them with violating the Anti-Riot Act.  AG John Mitchell then asked the U.S. Attorney for the Northern District, Thomas Foran, to stay in office and direct the prosecution.

            In Hoffman’s chambers, I was unaware that any of this was happening.  But in the spring of 1969, Hoffman became the judge who would preside over the prosecution.

            Anyone could see from the very beginning that this case was a hot potato–such a hot potato that before it was assigned to Hoffman, it had bounced around the courthouse a couple of times.  Cases were supposed to be randomly assigned to judges according to a “wheel” in the clerk’s office.  But this time, the first two judges who’d been handed the case had reportedly sent it back.  One of these judges was Chief Judge Campbell.  I’m not sure about the other judge, but whoever he was, he had a lot more smarts than Hoffman did.

            [I had my own run-in with Judge Campbell, beginning in February 1970.  But that’s a story for another time.]

            When the case landed in Hoffman’s chambers, he seemed somewhat taken aback, but I think he may have been secretly pleased to be handed this case.  He might have even liked the idea that he’d be handling a high-profile prosecution that would draw a lot of attention.  In any event, his ego wouldn’t let him send the case back to “the wheel,” even on a pretext.

            I kept my distance from the “Chicago 8” case.  As Hoffman’s senior clerk, due to leave that summer, I wasn’t expected to do any work on it.  My co-clerk, at that time the junior clerk, would become the senior clerk after my departure, and he assumed responsibility for the pre-trial motions and other events related to the case.  I was frankly delighted to have little or no responsibility for this case.  It was clearly dynamite, and Hoffman was clearly the wrong judge for it.

            Since I was still working in Hoffman’s chambers, I could of course observe what was happening there.  And I could see what was going to happen long before the trial began.  Attorneys for the eight defendants (who later became seven when defendant Bobby Seale’s case was severed, in a sadly shocking episode about a month after the trial began) immediately began filing pre-trial motions that contested absolutely everything. 

            As I recall, one pre-trial motion explicitly asked Hoffman to recuse himself (i.e., withdraw as judge).  The defense lawyers’ claim was that Hoffman’s conduct of previous trials showed that he couldn’t conduct this trial fairly.  If Hoffman had been smart, he would have seized upon this motion as a legitimate way to extricate himself from the case.  He must have already suspected that things in his courtroom might not go well.  But again, his pride wouldn’t allow him to admit that there was anything in his history that precluded him from conducting a fair trial.

            Soon the national media began descending on the courtroom to report on Hoffman’s rulings on the pre-trial motions.  One day Hoffman came into the clerks’ room to show us a published article in which a reporter had described the judge as having a “craggy” face.  “What does ‘craggy’ mean?” he asked us. 

            My co-clerk and I were dumbfounded, wondering how to respond to such a bizarre question.  The word “craggy” had always sounded rather rugged to me, while Hoffman looked much more like the cartoon character Mr. Magoo (as many in the media soon began to describe him).  I muttered something about “looking rugged,” while my co-clerk stayed silent.  Hoffman looked dubious about my response and continued to harp on the possible definition of “craggy” for another five or ten minutes until he finally left.

            The problem with Hoffman’s treatment of the “Chicago 7” case was, fundamentally, that he treated it like every other criminal case he’d ever handled.  And the defense attorneys were right.  He had a record of bias in favor of government prosecutors.

            This problem became his downfall.  He refused to see that this case was unique and had to be dealt with on its own terms, unlike all of the other criminal cases in his past. 

            Further, he lacked any flexibility and remained committed to the way he’d always conducted proceedings in his courtroom.  If he’d had some degree of flexibility, that might have helped the trial proceed more smoothly.  But at 74, after 16 years on the bench, he was accustomed to running an orderly courtroom with lawyers and defendants who followed the rules.

            He would not have an orderly courtroom this time, and he was completely unable to bend those rules.

The film, “The Trial of the Chicago 7,” written and directed by Aaron Sorkin

            This film, which first appeared in September 2020 (I’ll call it “the Sorkin film”), makes the trial the centerpiece of a lengthy and detailed dramatization, along with the events that led up to it. 

The film is an impressive achievement.  I applaud Sorkin for bringing attention to the 50-year-old trial and to many of the people and events who were part of it.

I’ve chosen not to critique the film but simply to add comments based on my own recollections from that era along with what I’ve gleaned from my independent research.

The Sorkin film has notably garnered a 90 percent positive score on Rotten Tomatoes, based on nearly 300 critics’ reviews.  Some of the reviews are glowing, others less so.

I’ll quote from a sampling of reviews.

A.O. Scott wrote in The New York Times:  The film is “talky and clumsy, alternating between self-importance and clowning.”

David Sims wrote in The Atlantic:  This is “a particularly shiny rendering of history, but Sorkin wisely [focuses] on America’s failings, even as he celebrates the people striving to fix them.”

Joe Morgenstern wrote in The Wall Street Journal:  The film “diminishes its aura of authenticity with dubious inventions” and “muddies its impact by taking on more history than it can handle.”

Sorkin’s overall themes are opposition to an unjust war, specifically the Vietnam War; the attempt by activists in 1968 to achieve what they viewed as justice and to strengthen democracy; and how all of this played out politically.  As A.O. Scott noted in his review, “the accident of timing” helped to bolster these themes, with “echoes of 1968” clear to most of us in 2020:  “the appeals to law and order, the rumors of radicals sowing disorder in the streets, the clashes between police and citizens.” 

Sorkin himself told an interviewer that protestors in 2020 got “demonized as being un-American, Marxist, communist—all things they called the Chicago 7.”  He added, “The movie is not intended to be a history lesson, or about 1968—it’s about today.”

(As I point out later in “A Brief Detour,” these themes also played out in Greece during the 1960s.)

In his screenplay, Sorkin sets the scene well.  He begins with news coverage noting LBJ’s escalation of troops and draft calls to beef up the war in Vietnam.  He includes a clip of Martin Luther King Jr. stating that the war was poisoning the soul of America.  He also highlights the assassination of Robert F. Kennedy, who had spoken out against the war, while at the same time noting the increase in casualties among the troops in Vietnam.

In addition, Sorkin makes clear that two of the Chicago 7 defendants, Tom Hayden and Rennie Davis, were leaders of SDS (Students for a Democratic Society), an organization maintaining that the Vietnam War was contrary to our notions of social justice.  He also shows us Abbie Hoffman (hereinafter Abbie, to avoid confusion with the judge) and Jerry Rubin–who wanted to see either Senator Eugene McCarthy or Senator George McGovern nominated for the presidency–proclaiming that there wasn’t enough difference between Humphrey and Nixon to merit a vote for Humphrey.  (Gosh, this sounds familiar, doesn’t it?  It reminds me of Ralph Nader in 2000, proclaiming that there was no difference between Al Gore and George W. Bush. Thanks, Ralph, for helping to defeat Al Gore and giving us George W. Bush and the war in Iraq.) 

One more thing:  Abbie and Rubin claim in a clip that they’re going to the convention in Chicago “peacefully,” but “we’ll meet violence with violence.”

The film has deservedly won over a large number of admiring movie-watchers, but let’s be honest: Many if not most of them have little or no knowledge of the real story portrayed in it.

Sorkin’s screenplay received the Golden Globe award as the best screenplay of 2020, and it’s been nominated for an Oscar in that category.  The film has also been nominated for an Oscar as the Best Motion Picture of 2020.  One of its actors, Sacha Baron Cohen, is nominated for best supporting actor, and the film is nominated in three other Oscar categories.  In April, the cast received the Screen Actors Guild award for the Outstanding Performance by a Motion Picture Cast.

A few of my own comments

            As I’ve previously pointed out, in the spring of 1969 I was serving as Hoffman’s senior clerk.  I wasn’t responsible for advising him on his rulings during the trial (which began after my departure that summer), and I also didn’t take part in his rulings before the trial.  But it was impossible not to observe what was happening in his chambers while I was still working there.

            But I was unaware of many of the events taking place outside of his chambers, and I don’t recall whether I personally observed any of the pre-trial courtroom appearances of the defense attorneys.  I also never observed the conduct of any of the defendants before the trial began, unless they appeared on local TV news coverage.

            For these reasons, I found much of the Sorkin film illuminating.  Although I’d very much like to know the sources Sorkin relied on in crafting his screenplay, I haven’t attempted to find out exactly what they were.  For proceedings in the courtroom both before and during the trial, I’m sure that Sorkin relied on the court transcript, which would have recorded everything said in court by the prosecutors, the defendants, defense counsel, the judge, and the many witnesses. 

            [Because of my own experience with court reporters, I know that not every word said in court is in fact recorded properly.  When I said during an oral argument (in a case called Doe v. Scott) that there was “no consensus” among medical experts regarding when life begins, the court reporter recorded my response as “no consequences.”  A very different word with a very different meaning in that context.  But in the trial of the “Chicago 7,” it’s probably safe to assume that the court reporter got most of the words right.]

            As for anything said outside of court, I’ll assume that Sorkin chose to rely on reputable sources.  I know, for example, that defense attorney William Kunstler published a book titled “My Life as a Radical Lawyer,” which probably provided helpful background for some of what happened (at least from Kunstler’s viewpoint).  Countless other books, interviews, and media accounts were no doubt researched and used to support scenes in the film.  Kudos to Sorkin if he and his staff perused these books and other background material for insights into what happened.

            I nevertheless want to ask, on my behalf as well as yours:

            How accurate is the film?

            Although Sorkin may have done a thorough job of research, there’s no question that he took considerable “creative license” when he wrote his screenplay.  He chose to emphasize certain events and to de-emphasize, revise, or omit others.  He also created totally new stuff to dramatize the story.

             For a review of what’s accurate and what’s not, I recommend two online articles.  One that strikes me as a careful job that squares with what I remember is “What’s Fact and What’s Fiction in The Trial of the Chicago 7” by Matthew Dessem, published on Oct. 15, 2020, in Slate.com.   A similar article appeared around the same time in smithsonianmag.com.

                                                To be continued

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally confront us with only one way to dry our hands:  paper towels OR air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive review of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers use more energy than it takes to produce a paper towel, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show that the opposite is true.  These studies found that blowers tend to spread bacteria from 3 to 6 feet.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can ourselves take action by asking those who manage the restrooms we frequent to adopt the use of paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.

Happy Holidays! Well, maybe…

As the greeting “Happy Holidays” hits your ears over and over during the holiday season, doesn’t it raise a question or two?

At a time when greed and acquisitiveness appear to be boundless, at least among certain segments of the American population, the most relevant questions seem to be:

  • Does money buy happiness?
  • If not, what does?

These questions have been the subject of countless studies.  Let’s review a few of the answers they’ve come up with.

To begin, exactly what is it that makes us “happy”?

A couple of articles published in the past two years in The Wall Street Journal—a publication certainly focused on the acquisition of money—summarized some results.

Wealth alone doesn’t guarantee a good life.  According to the Journal, what matters a lot more than a big income is how people spend it.  For instance, giving money away makes people much happier than spending it on themselves.  But when they do spend it on themselves, they’re a lot happier when they use it for experiences like travel rather than material goods.

The Journal looked at a study by Ryan Howell, an associate professor of psychology at San Francisco State University, which found that people may at first think material purchases offer better value for their money because they’re tangible and they last longer, while experiences are fleeting.  But Howell found that when people looked back at their purchases, they realized that experiences actually provided better value.  We even get more pleasure out of anticipating experiences than we do from anticipating the acquisition of material things.

Another psychology professor, Thomas Gilovich at Cornell, reached similar conclusions.  He found that people make a rational calculation:  “I can either go there, or I can have this.  Going there may be great, but it’ll be over fast.  But if I buy something, I’ll always have it.”  According to Gilovich, that’s factually true, but not psychologically true, because we “adapt to our material goods.”

We “adapt” to our material goods?  How?  Psychologists like Gilovich talk about “hedonic adaptation.”  Buying a new coat or a new car may provide a brief thrill, but we soon come to take it for granted.  Experiences, on the other hand, meet more of our “underlying psychological needs.”

Why?  Because they’re often shared with others, giving us a greater sense of connection, and they form a bigger part of our sense of identity.  You also don’t feel that you’re trying to keep up with the Joneses quite so much.  While it may bother you when you compare your material things to others’ things, comparing your vacation to someone else’s won’t bug you as much because “you still have your own experiences and your own memories.”

Another article in the Journal, published in 2015, focused on the findings of economists rather than psychologists.  A group of economists, including John Helliwell, a professor at the University of British Columbia, concluded that happiness—overall well-being—should not be measured by metrics like per-capita income and gross domestic product (GDP).  “GDP is not even a very good measure of economic well-being,” he said.

Instead, the World Happiness Report, which Helliwell co-authored, ranked countries based on how people viewed the quality of their lives. It noted that six factors account for 75 percent of the differences between countries.  The six factors:  GDP, life expectancy, generosity, social support, freedom, and corruption.  Although GDP and life expectancy relate directly to income, the other four factors reflect a sense of security, trust, and autonomy.  So although the U.S. ranked first in overall GDP, it ranked only 15th in happiness because it was weaker in the other five variables.

According to Jeffrey D. Sachs, a professor at Columbia and co-author of the World Happiness Report, incomes in the U.S. have risen, but the country’s sense of “social cohesion” has declined.  The biggest factor contributing to this result is “distrust.”  Although the U.S. is very rich, we’re not getting the benefits of all this affluence.

If you ask people whether they can trust other people, Sachs said, “the American answer has been in significant decline.”   Fast-forward to 2017.  Today, when many of our political leaders shamelessly lie to us, our trust in others has no doubt eroded even further.

Even life expectancy is going downhill in the U.S.  According to the AP, U.S. life expectancy was on the upswing for decades, but 2016 marked the first time in more than a half-century that it fell in two consecutive years.

Let’s return to our original question:  whether money can buy happiness.  The most recent research I’ve come across is a study done at Harvard Business School, noted in the November-December 2017 issue of Harvard Magazine.  Led by assistant professor of business administration Ashley Whillans, it found that, in developed countries, people who trade money for time—by choosing to live closer to work, or to hire a housecleaner, for example—are happier. This was true across the socioeconomic spectrum.

According to Whillans, extensive research elsewhere has confirmed the positive emotional effects of taking vacations and going to the movies.  But the Harvard researchers wanted to explore a new idea: whether buying ourselves out of negative experiences was another pathway to happiness.

Guess what:  it was.  One thing researchers focused on was “time stress” and how it affects happiness.  They knew that higher-earners feel that every hour of their time is financially valuable.  Like most things viewed as valuable, time is also perceived as scarce, and that scarcity translates into time stress, which can easily contribute to unhappiness.

The Harvard team surveyed U.S., Canadian, Danish, and Dutch residents, ranging from those who earned $30,000 a year to middle-class earners and millionaires. Canadian participants were given a sum of money—half to spend on a service that would save one to two hours, and half to spend on a material purchase like clothing or jewelry.  Participants reported more positive feelings, and fewer feelings of time stress, after a time-saving purchase (like buying take-out food) than after their shopping sprees.

Whillans noted that in both Canada and the U.S., where busyness is “often flaunted as a status symbol,” opting to outsource jobs like cooking and cleaning can be culturally challenging.  Why?  Because people like to pretend they can do it all.  Women in particular find themselves stuck in this situation.  They have more educational opportunities and are likely to be making more money and holding more high-powered jobs, but their happiness is not increasing commensurately.

The Harvard team wants to explore this in the future.  According to Whillans, the initial evidence shows that among couples who buy time, “both men and women feel less pulled between the demands of work and home life,” and that has a positive effect on their relationship.  She hopes that her research will ameliorate some of the guilt both women and men may feel about paying a housekeeper or hiring someone to mow the lawn—or ordering Chinese take-out on Thursday nights.

Gee, Ashley, I’ve never felt guilty about doing any of that.  Maybe that’s one reason why I’m a pretty happy person.

How about you?

Whatever your answer may be, I’ll join the throng and wish you HAPPY HOLIDAYS!

Proms and “The Twelfth of Never”

It’s prom season in America.

Do you remember your senior prom?

The twelfth of June never fails to remind me of mine.

The prom committee named our prom “The Twelfth of Never,” and it’s easy to remember why.  The prom took place on June 12th.  The name was also that of a popular song recorded by Johnny Mathis–one of my favorites on his album, “Johnny’s Greatest Hits.”

As one of Johnny’s fans, I owned this album and played it over and over till I knew the words to all of the songs, including this one.  Many of his songs became standards, and PBS has recently been showcasing his music in one of its most appealing fund-raising lures.

I immortalized the song title in my own small way by writing in my novel Jealous Mistress that the protagonist, Alison Ross, hears it playing while she shops in her supermarket in 1981: “My fellow shoppers were gliding up and down the aisles of the Jewel, picking items off shelves to the tune of ‘The Twelfth of Never.’”

When I was 11 or 12, my favorite crooner was Eddie Fisher, who was then at the top of his game.  But by my last year of high school, I’d shifted my loyalties to Johnny Mathis and Harry Belafonte.  In addition to Johnny’s album, I treasured Belafonte’s astonishing “Belafonte” LP and played it, like Johnny’s, over and over, learning those words, too.

Although I wasn’t part of the prom committee (I was busy chairing the luncheon committee), and “the twelfth of never” referred to a date when something was never going to happen, I was okay with the name the committee chose.  My more pressing concern was who would be my date.  Would it be my current crush, a friend since first grade who’d metamorphosed into the man of my dreams?  (I hoped so.)  Would it be last year’s junior prom date?  (I hoped not.)  Who exactly would it be?

As luck would have it, an amiable and very bright classmate named Allen stepped forward and asked me to go to the prom.  I could finally relax on that score.  But we weren’t really on the same wavelength.  When we went on a few other dates before prom, they became increasingly awkward.

On one date we saw “Some Like It Hot” at a filled-to-capacity downtown Chicago movie theater, where we sat in the last row of the balcony.  The film was terrific (it’s been judged the top comedy film of all time by the American Film Institute), and Allen clearly loved it.  His delight unfortunately ended in an ache or two.  When he heard the last line, spoken by Joe E. Brown to Jack Lemmon (“Well, nobody’s perfect”), Allen laughed uproariously, threw his head back, and hit it on the wall behind our seats.  I felt sorry for him—it must have hurt—but it was still pretty hard to stifle a laugh.  (I don’t think it hurt his brainpower, though.  As I recall, Allen went on to enroll at MIT.)

Although the bloom was off the rose by the time the prom came along, Allen and I went off happily together to dance on the ballroom floor of the downtown Knickerbocker Hotel, noted for the floor’s colored lights.  (The Knickerbocker spent the 1970s as the icky Playboy Towers but has since reverted to its original name.)  We then proceeded to celebrate some more by watching the remarkable ice-skating show offered on a tiny rink surrounded by tables filled with patrons, like a bunch of us prom-goers, at still another big hotel downtown.

Most of us were unknowingly living through an era of innocence.  For some of my classmates, the prom may have involved heavy kissing, but I doubt that much more than that happened.  In my case, absolutely nothing happened except for a chaste kiss at the end of the evening.

For better or worse, proms have evolved into a whole different scene.  In April, The Wall Street Journal noted that although the rules of prom used to be simple, they’re more complicated today.  At Boylan Catholic High School in Illinois, for example, a 21-page rulebook governs acceptable prom-wear.  Other schools require pre-approval of the prom dresses students plan to wear–in one school by a coach, in another by a three-person committee.

Administrators add new rules every year “to address new trends and safety concerns.” These have included banning canes, boys’ ponytails, and saggy pants, as well as two-piece dresses that might reveal midriffs and dresses with mesh cutouts that suggest bare skin.

But students have begun to revolt.  The students at Boylan Catholic have organized their own prom, arguing that the 21-page dress code contributed to body-shaming.  They point to a rule that states: “Some girls may wear the same dress, but due to body types, one dress may be acceptable while the other is not.”  A male student who helped organize Morp (the alternative prom) said that “girls were offended…. Somebody needed to step up and do something.”

At a school in Alabama, one student hoped to take his grandmother to his prom since she’d never been to one, but her age exceeded the maximum of 20, so she wasn’t allowed to go.  The student was “mad,” skipped the school prom, and celebrated at his grandmother’s home instead.  Not surprisingly, the school defended its rule, stating that it wanted to discourage students’ inviting older relatives who might present a safety issue by drinking alcohol:  “It just causes problems.”  But the school district later joined with a senior center to host an annual prom for senior citizens.  Presumably, Granny went to a prom after all.

According to the Journal, New York City students have another option altogether.  The New York Public Library hosts an annual free “Anti-Prom” in June for students 12 to 18, who can attend in any garb they choose.

In the Bay Area, another phenomenon has occurred:  “promposals”–photos and videos posted on social media in which one student asks another one to prom.  The San Francisco Chronicle views these as a way for kids “to turn themselves into YouTube, Twitter and Instagram sensations.”  In 2014, a boy trotted up to school on a horse, holding a sign that asked his girlfriend to “ride to prom” with him.  Last year, a kid built a makeshift “castle” and wrote a Shakespearean-style play to ask a friend to prom.  And in Berkeley, a boy choreographed a hip-hop dance routine with a bunch of other kids and performed it for his hoped-for date in front of 200 classmates.

In April, the Chronicle reported data on the national emergence of promposals.  From only 17 on Twitter in 2009, the number grew to 764,000 in 2015, while on YouTube, videos went from 56,000 in 2009 to 180,000 last year.  (Millions of teens also post pictures about the prom itself on Instagram.)  The promposal phenomenon may be dying down, with fewer elaborate ones noted this year at a school in Oakland.  But who knows?

One thing we know for certain:  The high school prom-scene has changed.

But even though things have changed, prom-goers today are still teenagers much like us when we went to prom, with all of the insecurities and anxieties that go along with being a teen.

For me, mostly-happy memories of “The Twelfth of Never” return every year on the twelfth of June.   Maybe mostly-happy, or not-so-happy, memories of your prom return every year as well.

As Johnny’s song reminds us, our memories of prom can endure for “a long, long time.”

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  And I just came across several studies documenting another worrisome result of rude behavior:  serious errors made by doctors and nurses who have been treated rudely.

The medical profession is apparently concerned about rude behavior within its ranks, and these studies reflect that concern.

One of the studies, reported on April 12 in The Wall Street Journal, concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams heard a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department.  The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for the Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, a well-known pediatrician and writer who pens a column on health care for The New York Times, recently reviewed studies of rudeness in medical settings.  One study looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient emergencies, found that doctors and nurses (again working in teams in a neonatal ICU) were less effective in teamwork, communication, and diagnostic and technical skills after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

These results bothered Klass.  She had always known that parents are sometimes rude and that rudeness can be upsetting, but she didn’t think “it would actually affect my medical skills or decision making.”  In light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  In a 2015 British study, 31 percent of doctors reported being affected by “rude, dismissive and aggressive communication” from other doctors.  Other studies found that rudeness toward medical students by attending physicians, residents, and nurses is also a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area in the brain that responds to unpleasant emotional experiences—show that this region becomes desensitized with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.   When these leaders set out on a course of telling lies, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and points them out to the rest of us whenever they can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]


On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.


Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This example seems ripe for joke-telling, and sure enough, someone made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate the joke-teller as highly confident.  But inappropriate or insensitive jokes do a joke-teller no favors:  they can actually lower the teller’s standing.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

You wouldn’t like me when I’m angry

We see anger all around us. And it’s worse than ever. As The New York Times recently noted, rudeness and bad behavior “have grown over the last decades.” The Times focused on rudeness and incivility by “mean bosses” who cause stress in the workplace, but the phenomenon is widespread, appearing almost everywhere.

Along with mean bosses, we’ve all witnessed incidents of “road rage.” These sometimes lead to fatal results. I can understand road rage because I’m susceptible to it myself, but I strive to keep it under control. (I’m usually satisfied by hurling vicious insults at other drivers that they fortunately can’t hear.)

As a pedestrian, I’m often angered by rude and careless drivers who nearly mow me down as I walk through a crosswalk. Fortunately, my rage is usually tempered by my silent riposte, “I’m walkin’ here,” Ratso Rizzo’s enduring phrase.

Other common examples of anger include parents’ frustration with their children’s behavior. You’ve probably seen parents going so far as to hit their children in public, unable to restrain their anger even when others are watching.

Can we deal with anger by seeking revenge? That tactic, unwisely adopted by the two enraged drivers in the Argentinian film “Wild Tales,” may be tempting, but it’s clearly not the answer. Why? Because being angry simply isn’t good for your health.

Although anger can be useful, helping the body prepare to fight or flee from danger, strong anger releases the hormones adrenaline and cortisol into the bloodstream.  These can trigger an increase in heart rate and blood pressure and interfere with sugar metabolism (leading to still other problems).

According to the Times article, Robert M. Sapolsky, a Stanford professor and author of “Why Zebras Don’t Get Ulcers,” argues that when people experience even intermittent stressors like incivility for too long or too often, their immune systems pay the price. Major health problems, including cardiovascular disease, cancer, diabetes, and ulcers may result.

A host of medical researchers are not at all upset to tell you the results of their studies. “Anger is bad for just about everything we have going on physically,” according to Duke researcher Redford Williams, co-author of “Anger Kills: Seventeen Strategies for Controlling the Hostility That Can Harm Your Health.” Over time, he adds, chronic anger can cause long-term damage to the heart.

For example, new evidence suggests that an episode of extreme anger can increase the risk of a heart attack more than eightfold.  A study published in March 2015 found that patients had an 8.5 times greater risk of suffering a heart attack in the two hours after an outburst of intense anger than they would have otherwise.

The study, published in the European Heart Journal: Acute Cardiovascular Care, focused on patients in a Sydney, Australia, hospital who’d been “very angry, body tense, maybe fists clenched, ready to burst,” or “enraged, out of control, throwing objects, hurting [themselves] or others.” Although those are instances of extreme anger, not a typical angry episode, the finding is useful nonetheless.

A review of nine other studies, covering a combined 6,400 patients, found a similarly elevated rate of strokes, heart attacks, and irregular heartbeat following episodes of anger.

According to a recent article in The Wall Street Journal, most doctors believe smoking and obesity pose greater heart risks than anger does.  But someone with risk factors for heart trouble or a history of heart attack or stroke who is “frequently angry” has “a much higher absolute excess risk accumulated over time,” according to Elizabeth Mostofsky at Boston’s Beth Israel Deaconess Medical Center, who helped lead the nine-study review.

As the Journal article noted, some older studies have suggested that anger may be linked to other unfavorable results: increased alcohol consumption, increased smoking, and greater caloric intake. One study also found that high levels of anger were associated with serious sleep disturbances.

How do we deal with all of this anger? Anger-management counselors like Joe Pereira, cited by the Journal, recommend ways to curb hostility. First, avoid assuming others are deliberately trying to harm or annoy you. Also learn to tolerate unfairness, and avoid having rigid rules about how others should behave. “The more rules we have, the more people are going to break them. And that makes us angry,” Pereira says.

Experts also advise taking a timeout when one is gripped by anger. Karina Davidson, director of the Center for Behavioral Cardiovascular Health at Columbia University Medical Center, advises those who are prone to shouting to tell others “I’m very [hotheaded and] say things that don’t help the situation. It would help me if I could have 10 minutes and then maybe we could work together to resolve the situation.”

Lawyers are people who deal with anger all the time. As long ago as ancient Rome, the poet Horace wrote that lawyers are “men who hire out their words and anger.” Today, lawyers not only confront angry clients but also have to manage anger stemming from their opponents and themselves.

An article in the June 2014 issue of California Lawyer noted that lawyers currently face “an epidemic of incivility contaminating…the profession.”  The authors, Boston lawyer Russell E. Haddleton and Joseph A. Shrand, M.D. (author of “Outsmarting Anger”), pointed out that the California Supreme Court had just approved a revised oath of admission requiring that new lawyers commit to conducting themselves “at all times with dignity, courtesy, and integrity.”

Acknowledging that incivility will continue to crop up, the authors maintain that an angry lawyer is an ineffective advocate. They suggest a number of things lawyers can do to stay calm. Tips like these can help all of us.

Among their suggestions: Begin by recognizing the physical signs of anger, and think of ways to change the situation. Next, try to avoid being jealous of a talented adversary. Jealousy can cloud one’s vision and ignite anger. Finally, to defuse anger “in yourself, your opponent, the judge, jurors, or a witness,” they advise lawyers to aim for a calm demeanor that displays empathy, communicates clearly, and above all, shows respect for others.

“Respect” is the key watchword here. The authors argue that it gives lawyers an advantage by allowing them to use reason and common sense instead of rashly reacting to what goes on in a courtroom. Lawyers who reject angry responses and choose a respectful approach are better advocates. This approach can clearly help non-lawyers as well.

In the current Pixar film, “Inside Out,” an 11-year-old girl struggles with her emotions. The emotion of Anger (voiced by Lewis Black) sometimes tries to dominate, but the emotion of Joy (voiced by Amy Poehler) seeks to keep it under control, not letting it take over. This may be the answer for all of us. If we try to find the joy in our lives—the good things that make us happy–we can triumph over anger and all of the dangerous consequences that flow from it.

We don’t have to turn into a large green Hulk every time something angers us. Let’s try instead to emulate the non-angry side of the Hulk.

I plan to do just that. You’ll like me much better that way.

Take a hike

The lure of “the gym” has always escaped me. I’ve joined a few fitness centers in my day, but I consistently end up abandoning the gym and resorting to my preferred route to fitness: walking. Whenever possible, I walk and hike in the great outdoors.

A host of recent studies has validated my faith in the benefits of walking. And some of these benefits may surprise you.

First, being active is better for your health. Duh. We’ve all suspected that for a long time. But here’s a new finding: sitting may be the real problem. Studies show that the more you sit, the greater your risk for health problems. In a study of more than two thousand adults ages 60 and older, every additional hour a day spent sitting was linked to a 50 percent greater risk of disability. Even those who got some exercise but were sitting too much were more likely to get dumped in the pool of disabled people.

Dorothy Dunlop and her colleagues at Northwestern University’s Feinberg School of Medicine concluded that sitting seems to be a separate risk factor.  Getting enough exercise is important, but it’s equally important not to be a couch potato the rest of the time.  Their study appeared in the Journal of Physical Activity & Health in 2014.

Another study, published in Medicine & Science in Sports & Exercise, noted something else about prolonged sitting: taking “short walking breaks” at least once an hour may lessen or even prevent some of the adverse effects, especially on the cardiovascular system. When healthy young men sat for 3 hours without moving their legs, endothelial function—the ability of blood vessels to expand and contract—dropped significantly from the very beginning. But when they broke up their sitting time with slow 5-minute walks every 30 or 60 minutes, endothelial function did not decline.

Here’s another benefit: Exercise, including walking, can keep you from feeling depressed. A British study, reported in JAMA Psychiatry, followed over 11,000 people (initially in their early 20s) for more than 25 years. It found that the more physically active they were, the less likely they were to have symptoms of depression. For example, sedentary people who started exercising 3 times a week reduced their risk of depression 5 years later by almost 20 percent. The researchers concluded that being active “can prevent and alleviate depressive symptoms in adulthood.”

Ready for one more reason to walk? A study described in The Wall Street Journal in 2014 found that walking can significantly increase creativity. This is a brand new finding. In the past, studies have shown that after exercise, people usually perform better on tests of memory and the ability to make decisions and organize thoughts. Exercise has also been linked anecdotally to creativity: writers and artists have said for centuries that their best ideas have come during a walk. But now science supports that link.

Researchers at Stanford University, led by Dr. Marily Oppezzo, decided to test the notion that walking can inspire creativity. They gathered a group of students in a deliberately unadorned room equipped with nothing more than a desk and a treadmill. The students were asked to sit and complete “tests of creativity,” like quickly coming up with alternative uses for common objects, e.g., a button. Facing a blank wall, the students then walked on the treadmill at an easy pace, repeating the creativity tests as they walked. Result: creativity increased when the students walked. Most came up with about 60 percent more “novel and appropriate” uses for the objects.

Dr. Oppezzo then tested whether these effects lingered. The students repeated the test when they sat down after their walk on the treadmill. Again, walking markedly improved their ability to generate creative ideas, even when they had stopped walking. They continued to produce more and better ideas than they had before their walk.

When Dr. Oppezzo moved the experiment outdoors, the findings surprised her. The students who walked outside did come up with more creative ideas than when they sat, either inside or outside, but walking outside did not lead to more creativity than walking inside on the treadmill. She concluded that “it’s the walking that matters.”

So a brief stroll apparently leads to greater creativity. But the reasons for it are unclear. According to Dr. Oppezzo, “It may be that walking improves mood,” and creativity blooms more easily when one is happier. The study appeared in The Journal of Experimental Psychology: Learning, Memory, and Cognition in 2014.

In truth, I don’t need these studies to convince me to keep walking. It helps that I live in San Francisco, where the climate allows me to walk outside almost every day. Walking is much more challenging when you confront the snow and ice that used to accompany my walks in and around Chicago. So I’m not surprised that walkers in colder climes often resort to exercising indoors.

It also helps that San Francisco has recently been voted the second most walkable city in America. According to Walk Score, an organization that ranks the “walkability” of 2,500 cities in the U.S., SF placed just behind New York City as the most walkable major American city.

SF’s high score is especially impressive in light of the city’s hills. Although I avoid the steepest routes, I actually welcome a slight incline because it adds to my aerobic workout. Why use a Stairmaster in a gloomy gym when I can climb uphill enveloped in sunshine and cool ocean breezes?

But whether you walk indoors or out, do remember to walk! You’ll assuredly benefit health-wise. And you just may enhance your creativity quotient. Someday you may even find yourself writing a blog like this one.