
Declare Your Independence: Those high heels are killers

Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes. 

I’ve revised this post in light of changes that have taken place during the past year.

My newly revised post follows:

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body. 

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.” 

During the past year, one dominated by the global pandemic, many women and men adopted different ways to clothe themselves.  Sweatpants and other comfortable clothing became popular.  [Please see my post, “Two Words,” published July 15, 2020, focusing on wearing pants with elastic waists.]

In particular, many women abandoned the wearing of high heels.  Staying close to home, wearing comfortable clothes, they saw no need to push their feet into high heels.  Venues requiring professional attire almost disappeared, and few women sought out occasions calling for any sort of fancy clothes or footwear.

As the pandemic has loosened its grip, at least in many parts of the country, some women have been tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing has been noted in publications like The New York Times and The Wall Street Journal.   In a recent story in the Times, one woman “flicked the dust off her…high-heeled lavender pumps” that she’d put away for months and got ready to wear them to a birthday gathering.  According to the Times, some are seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay.  I get it.  “Dressing up” may be your thing after more than a year of relying on sweatpants.  But “towering heels”?  They may look beautiful, they may be alluring….

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention. 

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a trend toward higher and higher heels, and I found it troubling.  I was baffled by women, especially young women, who bought into the mindset that they had to follow the dictates of fashion and the need to look “sexy” by wearing extremely high heels.  

When I’d watch TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  I couldn’t help noticing the women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word), yet wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over. 

On one of the last in-person Oscar Awards telecasts (before they became virtual), women tottered to the stage in ultra-high heels, often accompanied by escorts who kindly held onto them to prevent their embarrassing descent into the orchestra pit.

So…what about the women, like me, who adopted lower-heeled shoes instead?  I think we’ve been much smarter and much less likely to fall on our faces.

Foot-care professionals have soundly supported my view.  According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.

A couple of years ago, the San Francisco Chronicle questioned Dr. Amol Saxena, a podiatrist and foot and ankle surgeon who practiced in Palo Alto (and assisted Nike’s running team).  He explained that after 1.5 inches, the pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He did not endorse wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel.  If you insist on wearing higher heels, you should limit how much time you spend in them.

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles along with higher-heeled numbers.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  At the same time, I wondered whether Nordstrom was reflecting what its shoppers had already told the stores’ decision-makers.  The almighty power of the purse—how shoppers were choosing to spend their money—probably played a big role.

But the pandemic may have completely changed the dynamics of shoe-purchasing.  Once we faced the reality of the pandemic, and it stuck around for months, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.

Today, with the pandemic a somewhat less frightening presence in our lives, there are undoubtedly women who will decide to resurrect the high heels already in their closets.  They, and others, may be inspired to buy new ones, dramatically changing the statistics—and their well-being.

I hope these women don’t act in haste.  Beyond the issue of comfort, let’s remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When they need to flee a dangerous situation, high heels can handicap their ability to escape.

How many needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I once again urge the women of America to declare their independence from high-heeled shoes. 

If you’re currently thinking about returning to painful footwear, think again.

I encourage you to bravely gather any high heels you’ve clung to during the pandemic and throw those shoes away.  At the very least, please keep them out of sight in the back of your closet.  And don’t even think about buying new ones.  Shod yourself instead in shoes that allow you to walk in comfort—and if need be, to run.

Your wretched appendages, yearning to be free, will be forever grateful.

[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]

Hangin’ with Judge Hoffman

POST #8

This is the eighth in a series of posts that recall what it was like to serve as Judge Julius Hoffman’s law clerk from 1967 to 1969.

The “Chicago 7” Trial (continued)

            How did the Nixon victory lead to the trial of the “Chicago 7”?  The answer is simple.

             When control of the U.S. Justice Department shifted from the Johnson administration and its attorney general, Ramsey Clark, to Nixon’s team, things changed dramatically. 

            AG Clark had been reluctant to go after antiwar activists.  But Nixon was a warped personality, bent on punishing those he viewed as his enemies.  Once in office, with his own attorney general, John Mitchell, securely installed, he could prod federal prosecutors to go after his perceived foes.

            With the assistance of the FBI, long under the direction of another warped individual, J. Edgar Hoover, Nixon was able to track down his enemies, including antiwar protestors who had militated against him.  At the Democratic convention in Chicago in August 1968, antiwar activists’ outspoken opposition to the ultimately successful nomination of Hubert Humphrey (who in their view had not supported their cause with sufficient enthusiasm) disrupted the convention and undermined Humphrey’s ability to defeat Nixon.  As I noted in Post #7, Humphrey’s popular vote total in November was only one percent short of Nixon’s.  But that one percent made all the difference in the now-notoriously-undemocratic Electoral College.

            Many of these protestors had opposed the Vietnam War even before 1968, and they promised to further disrupt things once Nixon was elected.  Hoover’s FBI moved on from targeting people like members of the Communist Party USA to antiwar activists.  A covert program, Cointelpro, used a wide range of “dirty tricks,” including illegal wiretaps and planting false documents. 

I’ll add a recent update on Cointelpro here.

A fascinating revelation appeared in the San Francisco Chronicle in 2021

            On March 7 of this year, The San Francisco Chronicle revealed an FBI break-in that underscores what the agency was doing at this time.  On March 8, 1971, Ralph Daniel, then 26, was one of eight antiwar activists who had long suspected FBI malfeasance and broke into a small FBI office in Pennsylvania to seize records that would prove it.  (March 8 was chosen because, they hoped, FBI agents would be focused on the title fight between prizefighters Ali and Frazier that night.) The break-in was successful, and the records uncovered were leaked to journalists and others, exposing Hoover’s secret FBI program that investigated and spied on citizens accused of engaging in protected speech. 

            This was the massive Cointelpro operation that had amassed files on antiwar activists, students, Black Panthers, and other Black citizens.  Fred Hampton, the leader of the Chicago Black Panthers, was one target of this operation. (He plays a small role in Aaron Sorkin’s film, “The Trial of the Chicago 7,” before his shocking murder is revealed during that trial.  I remember learning of Hampton’s murder and feeling sickened by the conduct of local law enforcement, whose homicidal wrongdoing later became apparent.)

            In 1975, the U.S. Senate’s Church Committee found the FBI program illegal and contrary to the Constitution.  Exposure of Cointelpro tarnished Hoover’s legacy and damaged the reputation of the FBI for years.

            Ralph Daniel, a resident of the Bay Area, revealed his story to a Chronicle reporter fifty years after the break-in took place.

The legal underpinnings of the trial of the “Chicago 7”

            With John Mitchell running Nixon’s Justice Department, federal prosecutors were instructed to focus on one section in a federal statute originally intended to penalize those who created civil unrest following the assassination of Martin Luther King Jr., and specifically to use that statute to bring charges against antiwar activists.  The statute, which had been enacted on April 11, 1968, was mostly a follow-up to the Civil Rights Act of 1964, and it applied to issues like fair housing and the civil rights of Native American tribes. 

            But Title X of this law, which became known as the Anti-Riot Act, did something quite different.  It made it a felony to cross state lines or make phone calls “to incite a riot; to organize, promote or participate in a riot; or to aid and abet any person performing these activities.”  This provision, sometimes called the “H. Rap Brown Law,” was passed in response to the conduct of civil rights activist H. Rap Brown.  

How did Judge Hoffman become involved?

            In September 1968, shortly after the Chicago convention, the Chief Judge of the Northern District of Illinois, William J. Campbell, convened a grand jury to investigate possible charges against antiwar protestors who had been active during the convention.  The grand jury, which met 30 times over six months and heard about 300 witnesses, indicted the eight antiwar protestors who came to be dubbed the “Chicago 8” for violating the Anti-Riot Act.  AG John Mitchell then asked the U.S. Attorney for the Northern District, Thomas Foran, to stay in office and direct the prosecution.

            In Hoffman’s chambers, I was unaware that any of this was happening.  But in the spring of 1969, Hoffman became the judge who would preside over the prosecution.

            Anyone could see from the very beginning that this case was a hot potato–such a hot potato that before it was assigned to Hoffman, it had bounced around the courthouse a couple of times.  Cases were supposed to be randomly assigned to judges according to a “wheel” in the clerk’s office.  But this time, the first two judges who’d been handed the case had reportedly sent it back.  One of these judges was Chief Judge Campbell.  I’m not sure about the other judge, but whoever he was, he had a lot more smarts than Hoffman did.

            [I had my own run-in with Judge Campbell, beginning in February 1970.  But that’s a story for another time.]

            When the case landed in Hoffman’s chambers, he seemed somewhat taken aback, but I think he may have been secretly pleased to be handed this case.  He might have even liked the idea that he’d be handling a high-profile prosecution that would draw a lot of attention.  In any event, his ego wouldn’t let him send the case back to “the wheel,” even on a pretext.

            I kept my distance from the “Chicago 8” case.  As Hoffman’s senior clerk, due to leave that summer, I wasn’t expected to do any work on it.  My co-clerk, at that time the junior clerk, would become the senior clerk after my departure, and he assumed responsibility for the pre-trial motions and other events related to the case.  I was frankly delighted to have little or no responsibility for this case.  It was clearly dynamite, and Hoffman was clearly the wrong judge for it.

            Since I was still working in Hoffman’s chambers, I could of course observe what was happening there.  And I could see what was going to happen long before the trial began.  Attorneys for the eight defendants (who later became seven when defendant Bobby Seale’s case was severed, in a sadly shocking episode about a month after the trial began) immediately began filing pre-trial motions that contested absolutely everything. 

            As I recall, one pre-trial motion explicitly asked Hoffman to recuse himself (i.e., withdraw as judge).  The defense lawyers’ claim was that Hoffman’s conduct of previous trials showed that he couldn’t conduct this trial fairly.  If Hoffman had been smart, he would have seized upon this motion as a legitimate way to extricate himself from the case.  He must have already suspected that things in his courtroom might not go well.  But again, his pride wouldn’t allow him to admit that there was anything in his history that precluded him from conducting a fair trial.

            Soon the national media began descending on the courtroom to report on Hoffman’s rulings on the pre-trial motions.  One day Hoffman came into the clerks’ room to show us a published article in which a reporter had described the judge as having a “craggy” face.  “What does ‘craggy’ mean?” he asked us. 

            My co-clerk and I were dumbfounded, wondering how to respond to such a bizarre question.  The word “craggy” had always sounded rather rugged to me, while Hoffman looked much more like the cartoon character Mr. Magoo (as many in the media soon began to describe him).  I muttered something about “looking rugged,” while my co-clerk stayed silent.  Hoffman looked dubious about my response and continued to harp on the possible definition of “craggy” for another five or ten minutes until he finally left.

            The problem with Hoffman’s treatment of the “Chicago 7” case was, fundamentally, that he treated it like every other criminal case he’d ever handled.  And the defense attorneys were right.  He had a record of bias in favor of government prosecutors.

            This problem became his downfall.  He refused to see that this case was unique and had to be dealt with on its own terms, unlike all of the other criminal cases in his past. 

            Further, he lacked any flexibility and remained committed to the way he’d always conducted proceedings in his courtroom.  If he’d had some degree of flexibility, that might have helped the trial proceed more smoothly.  But at 74, after 16 years on the bench, he was accustomed to running an orderly courtroom with lawyers and defendants who followed the rules.

            He would not have an orderly courtroom this time, and he was completely unable to bend those rules.

The film, “The Trial of the Chicago 7,” written and directed by Aaron Sorkin

            This film, which first appeared in September 2020 (I’ll call it “the Sorkin film”), makes the trial the centerpiece of a lengthy and detailed dramatization, along with the events that led up to it. 

The film is an impressive achievement.  I applaud Sorkin for bringing attention to the 50-year-old trial and to many of the people and events that were part of it.

I’ve chosen not to critique the film but simply to add comments based on my own recollections from that era along with what I’ve gleaned from my independent research.

The Sorkin film has notably garnered a 90 percent positive score on Rotten Tomatoes, based on nearly 300 critics’ reviews.  Some of the reviews are glowing, others less so.

I’ll quote from a sampling of reviews.

A.O. Scott wrote in The New York Times:  The film is “talky and clumsy, alternating between self-importance and clowning.”

David Sims wrote in The Atlantic:  This is “a particularly shiny rendering of history, but Sorkin wisely [focuses] on America’s failings, even as he celebrates the people striving to fix them.”

Joe Morgenstern wrote in The Wall Street Journal:  The film “diminishes its aura of authenticity with dubious inventions” and “muddies its impact by taking on more history than it can handle.”

Sorkin’s overall themes are opposition to an unjust war, specifically the Vietnam War; the attempt by activists in 1968 to achieve what they viewed as justice and to strengthen democracy; and how all of this played out politically.  As A.O. Scott noted in his review, “the accident of timing” helped to bolster these themes, with “echoes of 1968” clear to most of us in 2020:  “the appeals to law and order, the rumors of radicals sowing disorder in the streets, the clashes between police and citizens.” 

Sorkin himself told an interviewer that protestors in 2020 got “demonized as being un-American, Marxist, communist—all things they called the Chicago 7.”  He added, “The movie is not intended to be a history lesson, or about 1968—it’s about today.”

(As I point out later in “A Brief Detour,” these themes also played out in Greece during the 1960s.)

In his screenplay, Sorkin sets the scene well.  He begins with news coverage noting LBJ’s escalation of troops and draft calls to beef up the war in Vietnam.  He includes a clip of Martin Luther King Jr. stating that the war was poisoning the soul of America.  He also highlights the assassination of Robert F. Kennedy, who had spoken out against the war, while at the same time noting the increase in casualties among the troops in Vietnam.

In addition, Sorkin makes clear that two of the Chicago 7 defendants, Tom Hayden and Rennie Davis, were leaders of SDS, an organization maintaining that the Vietnam War was contrary to our notions of social justice.  He also shows us Abbie Hoffman (hereinafter Abbie, to avoid confusion with the judge) and Jerry Rubin–who wanted to see either Senator Eugene McCarthy or Senator George McGovern nominated for the presidency–proclaiming that there wasn’t enough difference between Humphrey and Nixon to merit a vote for Humphrey.  (Gosh, this sounds familiar, doesn’t it?  It reminds me of Ralph Nader in 2000, proclaiming that there was no difference between Al Gore and George W. Bush. Thanks, Ralph, for helping to defeat Al Gore and giving us George W. Bush and the war in Iraq.) 

One more thing:  Abbie and Rubin claim in a clip that they’re going to the convention in Chicago “peacefully,” but “we’ll meet violence with violence.”

The film has deservedly won over a large number of admiring movie-watchers, but let’s be honest: Many if not most of them have little or no knowledge of the real story portrayed in it.

Sorkin’s screenplay received the Golden Globe award as the best screenplay of 2020, and it’s been nominated for an Oscar in that category.  The film has also been nominated for an Oscar as the Best Motion Picture of 2020.  One of its actors, Sacha Baron Cohen, is nominated for best supporting actor, and the film is nominated in three other Oscar categories.  In April, the cast received the Screen Actors Guild award for the Outstanding Performance by a Motion Picture Cast.

A few of my own comments

            As I’ve previously pointed out, in the spring of 1969 I was serving as Hoffman’s senior clerk.  I wasn’t responsible for advising him on his rulings during the trial (which began after my departure that summer), and I also didn’t take part in his rulings before the trial.  But it was impossible not to observe what was happening in his chambers while I was still working there.

            Although I could observe what went on in Hoffman’s chambers, I was unaware of many of the events taking place outside them, and I don’t recall whether I personally observed any of the pre-trial courtroom appearances of the defense attorneys.  I also never observed the conduct of any of the defendants before the trial began, unless they appeared on local TV news coverage.

            For these reasons, I found much of the Sorkin film illuminating.  Although I’d very much like to know the sources Sorkin relied on in crafting his screenplay, I haven’t attempted to find out exactly what they were.  For proceedings in the courtroom both before and during the trial, I’m sure that Sorkin relied on the court transcript, which would have recorded everything said in court by the prosecutors, the defendants, defense counsel, the judge, and the many witnesses. 

            [Because of my own experience with court reporters, I know that not every word said in court is in fact recorded properly.  When I said during an oral argument (in a case called Doe v. Scott) that there was “no consensus” among medical experts regarding when life begins, the court reporter recorded my response as “no consequences.”  A very different word with a very different meaning in that context.  But in the trial of the “Chicago 7,” it’s probably safe to assume that the court reporter got most of the words right.]

            As for anything said outside of court, I’ll assume that Sorkin chose to rely on reputable sources.  I know, for example, that defense attorney William Kunstler published a book titled “My Life as a Radical Lawyer,” which probably provided helpful background for some of what happened (at least from Kunstler’s viewpoint).  Countless other books, interviews, and media accounts were no doubt researched and used to support scenes in the film.  Kudos to Sorkin if he and his staff perused these books and other background material for insights into what happened.

            I nevertheless want to ask, on my behalf as well as yours:

            How accurate is the film?

            Although Sorkin may have done a thorough job of research, there’s no question that he took considerable “creative license” when he wrote his screenplay.  He chose to emphasize certain events and to de-emphasize, revise, or omit others.  He also created totally new stuff to dramatize the story.

             For a review of what’s accurate and what’s not, I recommend two online articles.  One that strikes me as a careful job that squares with what I remember is “What’s Fact and What’s Fiction in The Trial of the Chicago 7” by Matthew Dessem, published on Oct. 15, 2020, in Slate.com.  A similar article appeared around the same time in smithsonianmag.com.

                                                To be continued

RBG in ’72

Countless words have been, and will continue to be, written about the incomparable U.S. Supreme Court Justice Ruth Bader Ginsburg, who served on the high court for 27 years.

I will leave discussions of her tenure on the Court to others.

What I will do here is recount the one and only time I encountered her in person, at a law school conference, at a pivotal point in her career.  If you’re interested in learning about that encounter, please read on.

In September of 1972, I was a full-time faculty member at the University of Michigan (UM) Law School.  Notably, I was the only full-time faculty member who was a woman.

The law school had a desirable setting on the UM campus, whose multitude of elm trees were unfortunately denuded of leaves, thanks to Dutch elm disease. The law school buildings made up the stunning Law Quadrangle, featuring beautiful old buildings constructed in the English Gothic style.

My role on the faculty was to help first-year law students learn the basics of legal education:  how to analyze court rulings (the kind they would read in the books assigned to them in courses like Torts and Contracts); how to do their own research into case law; and how to write a readable legal document, like an appellate brief aimed at persuading an appellate court to decide in their favor.

I was one of four young lawyers hired to fill this role.  The three men and I each taught one-fourth of the first-year class.  As I recall, we got to choose our offices in the law school library, and I immediately chose a plum.  It was an enormous wood-paneled room with charming hand-blown stained glass windows.  One entered it via a stairway leading upstairs from the library’s impressive reading room.  I treasured my office and happily welcomed meeting with students there.  And I wonder, in light of renovations at the law school, whether that glorious office still exists.

At some point early that fall, I learned that a conference on “women and the law” would be held at the New York University School of Law in October.  This was a bold new area of law that most law schools didn’t consider worth their attention.  NYU was clearly an exception. 

The idea of the conference immediately grabbed my attention because I had a longstanding interest in its stated focus.  Beginning very early in my life, I was, and remain, concerned with achieving equity and justice, including equal rights for women.

This focus had led me to attend law school during the mid-’60s.  My first job was that of law clerk to a U.S. district judge in Chicago.  After finishing my clerkship, I became a practicing lawyer as a Reggie assigned to my first choice, the Appellate and Test Case Division of the Chicago Legal Aid Bureau.  [I discussed the Reggie program in a blog post, “The Summer of ’69,” published on August 7, 2015.]

And so, three years earlier, in October of 1969, I had begun working on a lawsuit that had a significant bearing on women’s rights because it would challenge the constitutionality of Illinois’s restrictive abortion law. This law had an enormous impact on the lives of women, especially poor and non-white women.

I worked with Sybille Fritzsche, a lawyer with the ACLU in Chicago, who became my close friend.  Sybille and I spent months preparing our case.  We filed our lawsuit in February 1970, argued it before a three-judge federal court in September, and won a 2-to-1 ruling in our favor in January 1971.  (The ruling in that case, Doe v. Scott, and the events leading up to it, are the focus of a book I’m currently writing.  In the meantime, you can read about our case in historian Leslie Reagan’s prize-winning book, When Abortion Was a Crime.)

Now, in the fall of 1972, I learned about the conference at NYU.  Because I was extremely interested in attending it, I decided to ask the UM law school’s dean, Theodore St. Antoine, whether the school might send me to New York to attend it.  I thought I had a pretty persuasive argument:  I was the only full-time woman on the law school faculty.  Didn’t the dean think it would be a good idea to send me to represent UM at the conference? 

How could he say “no”?  Ted thought about it for a moment, then gave his approval.  So off I went, my expenses paid by the kind patrons of UM. 

My hotel, the Fifth Avenue Hotel, located near NYU’s law school, had sounded appealing on paper, but it turned out to be something of a dump.  It suited me just fine, however, because I barely spent any time there.  I was too busy attending the conference sessions and, when I could, taking a short break to reconnect with a couple of law-school classmates and briefly sample life in New York City, a city light-years removed from less-than-exhilarating Ann Arbor, Michigan.

The conference, held on October 20-21, turned out to be a symposium sponsored by the AALS (the Association of American Law Schools), “The AALS Symposium on the Law School Curriculum and the Legal Rights of Women.”  It featured a number of prominent speakers, mostly law professors and practicing lawyers who had turned their attention to “the legal rights of women” in areas like tax law, property law, and criminal law.  I attended most of these sessions, and each of them was excellent.

But the only session I was really excited about was a talk by someone named Ruth Bader Ginsburg.  I was quite certain that I would relish hearing her talk, “Toward Elimination of Sex-Based Discrimination: Constitutional Aspects,” because the topic was right down my alley.

Looking back, I don’t think I knew anything about RBG at the time.  But when she was introduced (by NYU dean Robert McKay) and began to speak, I was riveted by every word she uttered.  She spelled out everything she had already done and planned to do to achieve gender equity.

So although I hadn’t been familiar with her before, I knew immediately that she was, and would continue to be, a brilliant leader in the field of women’s rights.  I filed her name away in my memory so I could follow whatever she would do in the coming years.  And I did just that, enthusiastically following the many astounding accomplishments she achieved after 1972.

Your image of RBG may be that of the frail, petite woman who took center stage in our culture in her 80s.  But the RBG I saw in 1972 was very different.  She was an amazingly attractive young woman of 39.  You can see photos of her at that time in The New York Times of September 18 (in Linda Greenhouse’s long review of her life and career) and in a recent issue of TIME magazine (Oct. 5-12, 2020). Although much has been made of her short stature (a trait I share), she was so very energetic and focused that one quickly forgot how small she was.

It turned out that she had attended Harvard Law School about a decade before I did.  Like her, I’ve been called a “trailblazer” and a “pioneer,” and I also confronted gender bias at every turn throughout my life.  My path was only a bit less rocky than hers:  My class at HLS included the whopping number of 25 women in a class of 520, while hers had only 9.

I’ve since learned that October 1972 marked a pivotal time in RBG’s career.  She had just switched her teaching position from Rutgers Law School to Columbia Law School (a considerable upgrade).  And she had just assumed another new position:  Director of the Women’s Rights Project at the ACLU, a project she had helped to found a short time before. 

So I’m left wondering…did she know about the case Sybille (an ACLU attorney in Chicago) and I brought in February 1970, a case that put a woman’s right to reproductive choice front and center?

RBG was an ardent supporter of reproductive rights during her tenure on the Supreme Court.  She discussed her views on abortion and gender equality in a 2009 New York Times interview, where she said “[t]he basic thing is that the government has no business making that choice for a woman.”

But I know that she had also said she wasn’t entirely happy with the way Roe v. Wade gave every woman in the U.S. that choice, a result achieved by bringing cases like Doe v. Scott in the federal courts.  She stated that she would have preferred that the argument be made, over time, in each state’s legislature, with the right to choose gradually adopted state by state rather than established in one overriding court ruling that covered every state.

Notably, on the 40th anniversary of the court’s ruling in Roe v. Wade, she criticized the decision because it terminated “a nascent democratic movement to liberalize abortion laws” that might have built “a more durable consensus” in support of abortion rights.

She had a point.  A democratic movement to liberalize abortion laws would have been the ideal route, and might have been a less contentious route, to achieving abortion rights throughout the country. 

But I think her position was influenced by her own life story. 

It stemmed, at least in part, from the fact that in April 1970, she was living and working in New York, where the state legislature had passed a new law allowing abortion, and New York Governor Nelson Rockefeller had signed it on April 11, 1970.  New York became only the second state in the U.S. (after Hawaii) to permit abortion, and only a few other states had carved out any sort of exception to what was otherwise a nationwide ban on abortion.

RBG may have optimistically believed that other states would follow New York’s lead.  But history has proved otherwise.

If women had waited for each of the 50 states to accomplish the goal of women’s reproductive choice, I think we’d still have many states refusing to enact laws allowing choice.  In support of my view, I ask readers to consider the situation today, when some states are so frenetically trying to restrict abortion, with or without achieving a complete ban, that they’re now simply waiting for a far-right conservative Court to overturn Roe v. Wade.

Whether or not RBG was aware of what was happening in the courtrooms of Chicago in 1970, I think I could have persuaded her that Sybille and I were doing the right thing.  

By advocating that the federal district court hold that the restrictive Illinois abortion law was unconstitutional, and persuading the court to decide in our favor, we achieved our goal of saving the lives and health of countless women who would have otherwise suffered from their inability to obtain a legal and medically safe abortion.

What greater achievement on behalf of women’s rights could there have been? 

I like to think that, after hearing my argument, RBG would have approved.

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally confront us with only one way to dry our hands:  paper towels OR air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive study of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers use more energy than producing a paper towel does, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show that the opposite is true.  These studies found that blowers tend to spread bacteria from 3 to 6 feet.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can take action ourselves by asking those who manage the restrooms we frequent to adopt paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.


Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly sideswiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting far more worrisome results from rude behavior:  serious errors made by doctors and nurses as a result of rudeness.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies was reported on April 12 in The Wall Street Journal; its conclusion was that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams would hear a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department. The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for the Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient-emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  In a 2015 British study, “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And other studies found that rudeness toward medical students by attending physicians, residents, and nurses was also a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Looking Back…The Election of 1984

If you’ve followed politics for as long as I have, you probably remember the election of 1984.  In the race for U.S. president, Ronald Reagan was the Republican incumbent, first elected in 1980, and seeking to be re-elected in 1984.  Most observers predicted that he would succeed.

Opposing him was the Democratic nominee, Walter Mondale.

I found the campaign for president so absorbing that shortly after Mondale lost, I wrote a piece of commentary on the election.  Somewhat astoundingly, I recently came across that long-lost piece of writing.

Regrettably, I never submitted it for publication.  Why?  In 1984 I was active in local politics (the New Trier Democratic Organization, to be specific), and I was apprehensive about the reaction my comments might inspire in my fellow Democrats.

Reviewing it now, I wish I’d submitted it for publication.

On June 11th of this year, after Hillary Clinton appeared to be the Democratic nominee for president, The New York Times published a front-page story by Alison Mitchell, “To Understand Clinton’s Moment, Consider That It Came 32 Years After Ferraro’s.”  Mitchell’s article is a brilliant review of what happened in 1984 and during the 32 years since.  My commentary is different because it was actually written in 1984, and it presents the thinking of a longstanding political observer and a lifelong Democrat at that point in time.

Here’s the commentary I wrote just after the election in November 1984.  It was typed on an Apple IIe computer (thanks, Steve Wozniak) and printed on a flimsy dot-matrix printer.  It’s almost exactly what I wrote back then, minimally edited, mostly to use contractions and omit completely unnecessary words.  I’ve divided it into two parts because of its length.


PART I

Although Walter Mondale conducted a vigorous and courageous campaign, perhaps nothing he did or did not do would have altered the ultimate result.  But his fate was probably sealed last July when he made two costly political mistakes.  He chose to tell the American people that he’d increase taxes, and he chose Geraldine Ferraro as his running mate.

Savvy political observers have always known that talk of increased taxes is the kiss of death for any candidate.  One wonders what made Walter Mondale forget this truism and instead decide to impress the electorate with his honesty by telling them what they had to know (or, rather, what he thought they had to know) about the deficit.  By making the deficit—a highly intangible concept to the average American voter—a cornerstone of his campaign, Mondale committed the political gaffe of the decade.  One can imagine the glee in the White House the night Mondale gave his acceptance speech and tipped his hand.  The most popular theme of the Reagan campaign became identifying Mondale with the idea of “tax, tax, tax; spend, spend, spend,” a theme that had spelled doom for Jimmy Carter and came to do the same for his Vice President.

Mondale’s choice of Geraldine Ferraro as his running mate was surely not a gaffe of the magnitude of his promise to increase taxes, but as a political judgment it was almost equally unwise.  Mondale faced a popular incumbent president.  All the signposts, even back in July, indicated that the American people were largely satisfied with Reagan and willing to give him another term.  To unseat a popular sitting president, Mondale—who’d been through a bloody primary campaign and emerged considerably damaged—had to strengthen his ticket by choosing a running mate with virtually no liabilities.  He simply couldn’t afford them.

Some of the best advice Mondale got all year was George McGovern’s suggestion that he choose Gary Hart for his vice president.  In one stroke, Mondale could have won the support of those backing his most formidable opponent, many of whom had threatened to go over to Reagan if their candidate wasn’t nominated.  Like Reagan in 1980, Mondale could have solidified much of the divided loyalty of his party behind him by choosing the opponent who’d come closest in arousing voters’ enthusiasm.  Instead he chose to pass over Hart and several other likely candidates and to select a largely unknown three-term congresswoman from New York City.

It pains me, as a feminist and an ardent supporter of women’s rights, to say this, but it must be said:  Mondale’s choice of Ferraro, however admirable, was a political mistake.  When the pressure from NOW and others to choose a woman candidate arose and gradually began to build, I felt uneasy.  When Congresswoman Patricia Schroeder (for whom I have otherwise unlimited respect) announced that if Mondale didn’t choose Hart, he had to choose a woman, my uneasiness increased.  And when Mondale at last announced his choice of Ferraro, my heart sank.  I was personally thrilled that a woman was at last on a national ticket, but I knew immediately that the election was lost, and that everything a Mondale administration might have accomplished in terms of real gains for women had been wiped out by his choice of a woman running mate.

There was no flaw in Ferraro herself that ensured the defeat of the Mondale-Ferraro ticket.  She’s an extremely bright, attractive, competent congresswoman and proved herself to be a gifted and inspiring V.P. candidate.  She has, by accepting the nomination, carved out a secure place for herself in the history books and maybe a significant role in national politics for decades to come.  She deserves all this and perhaps more.  But one must wonder whether even Ferraro in her own secret thoughts pondered the political wisdom of her choice as Mondale’s running mate.  If she is as good a politician as I think she is, I can’t help thinking that she herself must have wondered, “Why me, when he could have anyone else?  Will I really help the ticket? Well, what the hell, I’ll give it a shot!  It just might work.”

And it just might—someday.  But in 1984, up against a “Teflon President,” Mondale needed much more.  Reagan was playing it safe, and Mondale wasn’t.  Some observers applauded his choice of Ferraro as the kind of bold, courageous act he needed to bring excitement to a dull, plodding campaign.  But American voters weren’t looking for bold and courageous acts.  They wanted a President who didn’t rock the boat–a boat with which they were largely satisfied.  They might have been willing to throw out the current occupant of the White House if Mondale had been able to seize upon some popular themes and use them to his advantage.  Instead, the Reagan administration seized upon the tax-and-spend issue and the relatively good status of the economy to ride to victory while Mondale was still groping for a theme that might do the same for him.  And all the while he had a running mate with a liability:  a woman who had no national political stature and who turned out to have considerable problems of her own (notably, a messy financial situation).

Mondale’s choice of Ferraro was compared by Reagan to his appointment of Sandra Day O’Connor to the U.S. Supreme Court.  In the sense that both men selected highly capable but little-known women and in one stroke catapulted them to the top of their professions, Reagan was right.  But Reagan’s choice was very different and politically much smarter.  A V.P. candidate must be judged by the entire American electorate; a Supreme Court nominee is judged only by the U.S. Senate.  A vice president must stand alone, the metaphorical heartbeat away from the presidency; a Supreme Court justice is only one of nine judges on a court where most issues are not decided 5 to 4.  [We all recognize that this description of the Court in 1984 no longer fits in 2016.  But a single justice on the Court is still only one of nine.]

Let’s face it:  the notion of a woman V.P. (and the concomitant possibility of a woman president) is one that some Americans are clearly not yet comfortable with.  Although 16 percent of the voters polled by one organization said that they were more inclined to vote for Mondale because of Ferraro, 26 percent said they were less likely to.  It doesn’t take a mathematical whiz to grasp that 26 is more than 16.  These statistics also assume that the 55 percent who said that Ferraro’s sex was not a factor either way were being absolutely candid, which is doubtful.  Many men and women who are subconsciously uncomfortable with the idea of a woman president are understandably reluctant to admit it, to themselves perhaps as much as to others.


You wouldn’t like me when I’m angry

We see anger all around us. And it’s worse than ever. As The New York Times recently noted, rudeness and bad behavior “have grown over the last decades.” The Times focused on rudeness and incivility by “mean bosses” who cause stress in the workplace, but the phenomenon is widespread, appearing almost everywhere.

Along with mean bosses, we’ve all witnessed incidents of “road rage.” These sometimes lead to fatal results. I can understand road rage because I’m susceptible to it myself, but I strive to keep it under control. (I’m usually satisfied by hurling vicious insults that other drivers fortunately can’t hear.)

As a pedestrian, I’m often angered by rude and careless drivers who nearly mow me down as I walk through a crosswalk. Fortunately, my rage is usually tempered by my silent riposte, “I’m walkin’ here,” Ratso Rizzo’s enduring phrase.

Other common examples of anger include parents’ frustration with their children’s behavior. You’ve probably seen parents going so far as to hit their children in public, unable to restrain their anger even when others are watching.

Can we deal with anger by seeking revenge? That tactic, unwisely adopted by the two enraged drivers in the Argentinian film “Wild Tales,” may be tempting, but it’s clearly not the answer. Why? Because being angry simply isn’t good for your health.

Although anger can be useful—helping the body prepare to fight or flee from danger–strong anger releases hormones, adrenaline and cortisol, into the bloodstream. These can trigger an increase in heart rate and blood pressure and problems metabolizing sugar (leading to still other problems).

According to the Times article, Robert M. Sapolsky, a Stanford professor and author of “Why Zebras Don’t Get Ulcers,” argues that when people experience even intermittent stressors like incivility for too long or too often, their immune systems pay the price. Major health problems, including cardiovascular disease, cancer, diabetes, and ulcers may result.

A host of medical researchers are not at all upset to tell you the results of their studies. “Anger is bad for just about everything we have going on physically,” according to Duke researcher Redford Williams, co-author of “Anger Kills: Seventeen Strategies for Controlling the Hostility That Can Harm Your Health.” Over time, he adds, chronic anger can cause long-term damage to the heart.

For example, new evidence suggests that the risk of a heart attack rises more than eightfold after an extremely angry episode. A study published in March 2015 revealed that patients who’d experienced intense anger had an 8.5 times greater risk of suffering a heart attack in the two hours after the outburst than they would normally.

The study, published in the European Heart Journal: Acute Cardiovascular Care, focused on patients in a Sydney, Australia, hospital who’d been “very angry, body tense, maybe fists clenched, ready to burst,” or “enraged, out of control, throwing objects, hurting [themselves] or others.” Although those are instances of extreme anger, not a typical angry episode, the finding is useful nonetheless.

A review of nine other studies, including a combined 6,400 patients, found a higher rate of problems like strokes as well as heart attacks and irregular heartbeat.

According to a recent article in The Wall Street Journal, most doctors believe smoking and obesity pose greater heart risks than anger does. But someone with risk factors for heart trouble or a history of heart attack or stroke who is “frequently angry” has “a much higher absolute excess risk accumulated over time,” according to Elizabeth Mostofsky at Boston’s Beth Israel Deaconess Medical Center, who helped lead the nine-study review.

As the Journal article noted, some older studies have suggested that anger may be linked to other unfavorable results: increased alcohol consumption, increased smoking, and greater caloric intake. One study also found that high levels of anger were associated with serious sleep disturbances.

How do we deal with all of this anger? Anger-management counselors like Joe Pereira, cited by the Journal, recommend ways to curb hostility. First, avoid assuming others are deliberately trying to harm or annoy you. Also learn to tolerate unfairness, and avoid having rigid rules about how others should behave. “The more rules we have, the more people are going to break them. And that makes us angry,” Pereira says.

Experts also advise taking a timeout when one is gripped by anger. Karina Davidson, director of the Center for Behavioral Cardiovascular Health at Columbia University Medical Center, advises those who are prone to shouting to tell others “I’m very [hotheaded and] say things that don’t help the situation. It would help me if I could have 10 minutes and then maybe we could work together to resolve the situation.”

Lawyers are people who deal with anger all the time. As long ago as ancient Rome, the poet Horace wrote that lawyers are “men who hire out their words and anger.” Today, lawyers not only confront angry clients but also have to manage anger stemming from their opponents and themselves.

An article in the June 2014 issue of California Lawyer noted that lawyers currently face “an epidemic of incivility contaminating…the profession.” The authors, Boston lawyer Russell E. Haddleton and Joseph A. Shrand, M.D. (author of “Outsmarting Anger”), noted that the California Supreme Court had just approved a revised oath of admission requiring that new lawyers commit to conducting themselves “at all times with dignity, courtesy, and integrity.”

Acknowledging that incivility will continue to crop up, the authors maintain that an angry lawyer is an ineffective advocate. They suggest a number of things lawyers can do to stay calm. Tips like these can help all of us.

Among their suggestions: Begin by recognizing the physical signs of anger, and think of ways to change the situation. Next, try to avoid being jealous of a talented adversary. Jealousy can cloud one’s vision and ignite anger. Finally, to defuse anger “in yourself, your opponent, the judge, jurors, or a witness,” they advise lawyers to aim for a calm demeanor that displays empathy, communicates clearly, and above all, shows respect for others.

“Respect” is the key watchword here. The authors argue that it gives lawyers an advantage by allowing them to use reason and common sense instead of rashly reacting to what goes on in a courtroom. Lawyers who reject angry responses and choose a respectful approach are better advocates. This approach can clearly help non-lawyers as well.

In the current Pixar film, “Inside Out,” an 11-year-old girl struggles with her emotions. The emotion of Anger (voiced by Lewis Black) sometimes tries to dominate, but the emotion of Joy (voiced by Amy Poehler) seeks to keep it under control, not letting it take over. This may be the answer for all of us. If we try to find the joy in our lives—the good things that make us happy–we can triumph over anger and all of the dangerous consequences that flow from it.

We don’t have to turn into a large green Hulk every time something angers us. Let’s try instead to emulate the non-angry side of the Hulk.

I plan to do just that. You’ll like me much better that way.