
Those tempting holiday treats

December means one delicious holiday treat after another.  We’re all tempted to indulge.  But before you start munching, you might want to know the results of a couple of studies related to those holiday sweets.

First, if you love chocolate, you may already be aware of the virtues of dark chocolate.  But an important new study has just confirmed that only dark chocolate is associated with a lower risk of developing diabetes.  This 30-year study, conducted at the Harvard Chan School Department of Nutrition, followed almost 200,000 people who started out free of diabetes.  By the time the study ended, nearly 20,000 had been diagnosed with Type 2 diabetes.  Many of the participants reported specifically on their dark and milk chocolate intake.

It’s interesting, first of all, that those who ate at least 5 ounces of any kind of chocolate had a 10% lower risk of developing Type 2 diabetes than people who rarely or never ate chocolate.  But significantly, dark chocolate had a much bigger impact than milk chocolate.  Participants who ate dark chocolate had a 21% lower risk, with a 3% reduction in risk for every serving of dark chocolate eaten in a week.

At the same time, milk chocolate was NOT associated with reduced risk even though it has a similar level of calories and saturated fat.  Why?  According to the researchers, it’s the polyphenols in dark chocolate that may offset the effects of fat and sugar.

So before you bite into a mouthwatering chocolate dessert, try to find one made of dark chocolate.  I’ve been sampling some new dark chocolate candy bars, and they’re delicious.  It’s really no great hardship to switch from milk chocolate to dark.

You might also want to know about new research into one feature of the sweets we love:  their frequent dependence on high-fructose corn syrup.

Scientists at Washington University in St. Louis have found that dietary fructose promotes the tumor growth of certain cancers in animal models.  The finding in this study, published December 4 in the journal Nature, could open up new avenues for care and treatment of many types of cancer.

“The idea that you can tackle cancer with diet is intriguing,” said Gary Patti, a professor of chemistry, genetics, and medicine at the WashU School of Medicine.  The culprit seems to be fructose, which is similar to glucose.  Both are types of sugar, but the body seems to metabolize them differently.  Both are found naturally in fruits, vegetables, dairy products, and grains, and both are added as sweeteners in many processed foods. But the food industry has favored fructose because it’s sweeter. 

Consumption of fructose has escalated dramatically since the 1960s, and Patti pointed out that the number of items in your pantry that contain high-fructose corn syrup, the most common form of fructose, is “pretty astonishing.”  “Almost everything has it,” he added.  This includes foods like pasta sauce, salad dressing, and ketchup.  “Unless you actively seek to avoid it, it’s probably part of your diet.”

The problem is that fructose apparently fuels the growth of tumors.  I’ll skip the technical details; what matters is that we should limit dietary fructose as much as we can.  Investigators at WashU Medicine and elsewhere around the world are continuing to look into possible connections between the surge in fructose consumption and the increasing prevalence of cancers among people under the age of 50.  In the meantime, let’s try to limit our own fructose intake.

Here’s my advice:  If you plan to indulge in some yummy holiday treats, try to find those made with dark chocolate and those that don’t include high-fructose corn syrup.  If you can.

Happy holidays!

JFK

Today is November 22, a day forever marked by an American tragedy.  On this day in 1963, President John F. Kennedy was assassinated in Dallas, Texas.

As a young kid, I was inspired by Kennedy’s appearance in my world when the media focused on his candidacy for the vice-presidential nomination at the 1956 Democratic Convention.  A vivid contrast to Presidents Truman and Eisenhower, he was a youthful and vigorous U.S. Senator who advocated positive changes in our country.  Like many others in my generation, I found that the emergence of JFK on the political scene intensified my interest in American politics.  Later that year, my sister gave me a copy of “Profiles in Courage,” Kennedy’s book about political heroes in American history.  I treasured that book and eagerly read and re-read it.  Over the years, I’ve continued to collect books about JFK.  My collection includes my original copy of “Profiles in Courage.”

After his election as President in 1960, Kennedy continued to inspire me.  And on June 11, 1963, he spoke out in favor of equal justice for all Americans.  I had returned to my home in Chicago after my college graduation at WashU in time to watch the televised speech he gave that day.

JFK began by noting Alabama Governor George Wallace’s refusal, despite a court order, to allow the admission of two Black students to the University of Alabama.  He went on to say that “difficulties over segregation and discrimination exist in every city, in every State of the Union, producing in many cities a rising tide of discontent that threatens the public safety.”  This statement, like others in the speech, was important.  But I was mainly moved by these words: “The heart of the question is whether all Americans are to be afforded equal rights and equal opportunities, whether we are going to treat our fellow Americans as we want to be treated.”  After noting the special problem of racial discrimination, he added: “[T]his Nation, for all its hopes and all its boasts, will not be fully free until all its citizens are free.”

He said he planned to ask Congress to enact legislation giving all Americans “the right to be served in facilities…open to the public,” including hotels and restaurants, and to authorize the federal government “to participate more fully in lawsuits designed to end segregation in public education” and “greater protection for the right to vote.”  (His efforts eventually led to the enactment of the Civil Rights Act of 1964.)  He closed by asking for “the support of all our citizens.”

I sat transfixed in front of the TV, totally in awe of this speech, and I became an ardent supporter of the same ideals. 

Thanks to JFK, as a young person I developed a consuming interest in politics, and I began to think about a future where I could be involved in politics in some way.  One possible path occurred to me:  Attending law school and becoming a lawyer.  As I wrote in my handwritten journal in 1958, “I have developed a keen interest in law, and at the moment, I am busily planning to study law if possible.  At one time I believed I would be a writer….  Now, law and politics beckon, and…I am trying to convince myself that nothing is impossible and that if I want it badly enough, I will get it!”  Still a teenager, I wasn’t ready to make the leap to law school, but I did look forward to a future somehow focused on government and politics.  So I majored in political science in college and went on to be a graduate student in that field before abandoning it in favor of law school.

JFK’s assassination on November 22, 1963, traumatized me and probably most other Americans at the time.  It was truly shocking.  Looking back, I realize just how much it affected me.  As I wrote in my handwritten journal on the day after he was assassinated: “When the news of [his] death … was announced, I was too stunned to cry, too horrified to do much of anything but say the words echoed over and over by seemingly everyone…. I can’t believe it!  It’s incredible!  How could anyone do such a thing? And why?”  I added: “I was mourning the personal loss of an individual who had brought such vigor, such excitement, such brilliance, such intelligence, such energy…to everything he ever did in his life.  [He] was a personal icon to me, a hero, a leader to follow…who has always stood, in my eyes, for everything that was right in politics and government, and in the pursuit of power for noble aims, and who, I am certain, played a large part in motivating me…toward a life in politics and government for myself.  The result is perhaps a ‘new’ resolve…my resolve to dedicate my own life, as [he] dedicated his, to what is not always the easiest but what will surely be the most rewarding for me…a life of devoted public service to my country.  If I can, I will pursue legal studies for the next three years to prepare me [or else immediately devote] myself to the ideals of hard work and sacrifice in the public interest.” 

I’d grown up in an era when political assassinations happened only in “banana republics.”  Seeing a young, vital, and inspiring political leader like JFK cruelly shot down changed forever my view of America as a place where political transitions always occur peacefully.  The later assassinations of other American leaders (like Martin Luther King, Jr., and RFK) further traumatized me and others in our country. 

Although we lost him as our president, JFK had motivated me to pursue the study of American politics as well as the study of law.  At a pivotal moment, I chose to leave academia with the goal of becoming an activist through the law.

After graduating from law school, I did become an activist.  I was in the vanguard of lawyers who fought to secure women’s reproductive rights.  My co-counsel and I won a hard-fought victory, invalidating the restrictive Illinois abortion statute in 1971 (Doe v. Scott, 321 F. Supp. 1385 (N.D. Ill. 1971)).  As part of that lawsuit, I represented a Black teenage rape victim, winning a TRO in the appellate court that enabled her to have a legal abortion in March 1970.  This lawsuit is the focus of my forthcoming book, tentatively titled On the Barricades.

Throughout my life and my varied career, I’ve maintained my enormous interest in politics, government, and law.  Although I now view myself primarily as a writer, I continue to enthusiastically follow all of that today, whether current trends align with my personal views or not.

I will forever be indebted to JFK for inspiring me to follow this path.

Polls 2024

Are you fed up with polls?  I am.

I’m mad that every time I turn on TV news, I’m confronted with one poll after another.  The media seem obsessed with them, perhaps because they’re desperately trying to come up with fresh stories to fill their nearly endless need to offer viewers something new and different.

I’ve always been leery of polls.  First, I’ve never been asked to be in any poll, and no one I know has either.  I’ve always wondered exactly who the people answering questions in these polls are.  Currently, polls seem to be focusing on voters in “swing states” and voters in one demographic group or another.  Maybe I don’t fit into any of those categories.  But I think my views on any number of issues are valuable and should somehow be included in these polls.  Why aren’t they?

Further, I’m quite certain that the people who do participate are often led to answer questions in a given way, thanks to questions that are, in my view, slanted in one direction or another. You’ve probably noticed that, too.

Instead of getting mad, maybe I should take the advice of Ezra Klein, an opinion writer for The New York Times.  On October 13, he published a column, “Ignore the Polls.”  He makes a bunch of good points.  To begin with, he notes that you’re probably looking at polls to know who’ll win.  But the polls can’t tell you that.  On average, polls in every presidential election from 1972 to 2012 were off by two percentage points.  More recent polls, in 2016 and 2020, were off by even more.  Klein states that pollsters today are desperate to avoid the mistakes they made in 2016 and 2020, when they undercounted Trump supporters.  So some of them are asking voters to recall whom they voted for in 2020 and then using that info to count Trump voters more accurately this time.  But the results are very different when pollsters don’t ask voters to recall what they did in the past.  According to Klein, voters are “notoriously bad at recalling past votes.”  So why do the pollsters even bother asking?

Klein adds that polls are “remarkably, eerily stable.”  Events keep happening (like assassination attempts and televised debates), but the polls haven’t really changed.  So Klein advises us to give ourselves a break.  “Step off the emotional roller coaster.  If you want to do something to affect the election, donate money or time in a swing state…or volunteer in a local race.  Call anyone in your life who might actually be undecided or might not be registered to vote or might not make it to the polls.  And then let it go.” 

That’s exactly what I’ve been doing.  I’m glad my outlook resembles Ezra Klein’s.  Now if only the media would pay attention to his wise advice.  Hey, media people: ignore the polls.  Instead, seek out interesting stories about the candidates, the voters, and the issues.  Then let it go.

Cynicism can be bad for your health

Hey, it’s easy to be cynical these days. 

We’re faced with lie-spouting politicians threatening democratic rule in our country.  We’re confronted by incompetent jerks who make countless mistakes, or even try to scam us, at almost any business we patronize.  And I can’t forget the maniac drivers who weave from lane to lane at illegally high speeds, threatening to kill us every time we’re near them on the freeway. 

But hold on a minute.  A social scientist and author, Jamil Zaki, says that having a cynical worldview isn’t such a great idea.  In fact, it may have a negative effect on your health.

In his new book, “Hope for Cynics:  The Surprising Science of Human Goodness,” Zaki concedes that being a cynic can make us feel safer and smarter than the selfish, greedy, and dishonest people in our midst.  But his research at Stanford (where he’s the director of its Social Neuroscience Lab) suggests that it’s much better to become “hopeful skeptics.”  In other words, it’s okay to be critical of troublemakers, but you should also recognize how kind and generous most people really are.

What’s at play here?  Well, we tend to pay more attention to negative events than to positive ones.  This “negativity bias” leads us to remember the occasional driver who cuts us off in traffic while we ignore the countless drivers who are obeying the rules of the road.  Zaki says we should take 15 minutes out of our day and pay attention to the kindness all around us instead of the rudeness we encounter now and then.

Similarly, he recommends that we spread “positive gossip,” pointing out good deeds and kind behavior instead of doing the opposite–spreading mean-spirited gossip about people we dislike.

What’s the benefit of avoiding cynical thought?  You’ll probably feel better about humankind, and that will probably lead to better health.  According to Zaki, the cynical among us are more likely to suffer from depression, heart disease, and burnout.

In the midst of a heated campaign for mayor in San Francisco, one candidate has asked voters to end “the era of cynicism.”  He’s a political novice who has spent much of his personal fortune on philanthropic efforts aimed at improving life in our city, and he’s angry that his opponents have belittled those efforts.  I don’t blame him one bit.  Even though his philanthropy hasn’t always met its goals, the other candidates shouldn’t stoop to cynical bashing.  Instead of criticizing him (as they did in a recent televised debate), they could be praising his attempts to make life better. They could adopt a positive approach and advocate their own ideas for achieving worthwhile goals for our city.  Sadly, the negativity hurled during the debate was so awful that I immediately stopped watching.

As The New York Times review of Zaki’s book has warned: “Don’t Fall into the ‘Cynicism Trap.’”  I don’t plan to, and I hope you won’t either.  Let’s aim for hopeful skepticism.  If we avoid cynicism and instead pay more attention to the kindness around us, we just might feel better.

Declare your independence: Those high heels are killers

HAPPY JULY!  Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes.  My newly revised post for 2024 follows:


I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body.

Several years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.”

During the past few years, largely dominated by the global pandemic, many women and men adopted different ways of dressing.  Sweatpants and other comfortable clothing became popular.  Many women also abandoned wearing high heels.  Staying close to home, wearing comfortable clothes, they saw no need to push their feet into high heels, and venues requiring professional or fancy footwear almost disappeared.  But when the pandemic began to loosen its grip, some women were tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing was noted in publications like The New York Times and The Wall Street Journal.  According to the Times, some were seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay.  I get it.  “Dressing up” may be your thing after a few years of relying on sweatpants.  But “towering heels”?  They may look beautiful.

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.

But during one wearing of those heels, the pain became so great that I removed them and walked in stocking feet the rest of the way home.  After that painful lesson, I abandoned three-inch heels and resorted to wearing lower ones.  Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention.  Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a troubling trend toward higher and higher heels.  I was baffled by women, especially young women, who bought into the mindset that they had to follow the dictates of fashion and the need to look “sexy” by wearing extremely high heels.  Watching TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  Women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word) wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over.

Sadly, this phenomenon has reappeared.

Otherwise enlightened women are once again appearing on TV wearing absurdly high heels.  Even one of my favorite TV journalists, Stephanie Ruhle, has appeared on her “11th Hour” program on MSNBC in stilettos.  C’mon, Steph!  Don’t chip away at my respect for you.  Dump those stilettos!

What about the women, like me, who adopted lower-heeled shoes instead of following fashion?  I think we’re much smarter and much less likely to fall on our faces.  One very smart woman who’s still a fashion icon agreed with us long ago: the late Hollywood film star Audrey Hepburn.  Audrey dressed smartly, in both senses of the word.  I recently watched her 1963 smash film Charade for the tenth or twelfth time and once again noted how elegant she appeared in her Givenchy wardrobe and her–yes–low heels.  Audrey was well known for wearing comfortable low heels in her private life as well as in her films.  In Charade, paired with Cary Grant, another ultra-classy human being, she’s seen running up and down countless stairs in Paris Metro stations, chased by Cary Grant not only on those stairs but also through the streets of Paris.  She couldn’t possibly have done all that frantic running in high heels!

Foot-care professionals have soundly supported my view.  According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.  A noted podiatrist and foot and ankle surgeon has explained that above 1.5 inches, the pressure on the ball of the foot increases and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He advised against wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel.

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles.  Although stilettos hadn’t disappeared from the scene, they weren’t the only choices.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  Then the pandemic arrived and changed shoe-purchasing.  During its first year, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.  But now that the pandemic has largely dissipated, some women may have resurrected the high heels already in their closets.  They may even be inspired to buy new ones.  I hope they don’t.

There is heartening news from bellwether Nordstrom.  In a recent catalog, two pages featured nothing but sneakers.  Other pages displayed nothing but flat-heeled shoes and “modern loafers.”  Stilettos were nowhere to be seen.

Let’s not forget Gen Z.  Most Gen Z shoppers don’t follow the dictates of fashion.  They largely eschew high heels, choosing pricey and often glamorous sneakers instead–even with dressy prom dresses.

My own current faves: I wear black Skechers almost everywhere (I own more than one pair).  I occasionally choose my old standby, Reeboks, for serious walking.  (In my novel Red Diana, protagonist Karen Clark laces on her Reeboks for a lengthy jaunt, just as I do.)  I recently bought a pair of Ryka sneakers–so far so good.  And in warm weather, I wear walking sandals, like those sold by Clarks, Teva, and Ecco.

Any woman pondering buying high-heeled shoes should hesitate.  Beyond the issues of comfort and damage to your feet, please remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels may be putting their lives at risk.  When women need to flee a dangerous situation, high heels can handicap their escape.  How many needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I once again urge the women of America to declare their independence from high-heeled shoes.  If you’re thinking about returning to painful footwear, think again.  I urge you to bravely gather any high heels you’ve been clinging to and throw those shoes away.  At the very least, keep them out of sight in the back of your closet.  And don’t even think about buying new ones.  Shod yourself instead in shoes that allow you to walk in comfort–and if need be, to run.  Your wretched appendages, yearning to be free, will be forever grateful.

Does anyone still iron?

I like to stay abreast of the news.  To get current info, I’ve purchased online subscriptions to newspapers like The Washington Post.

The Post is currently in a state of upheaval, reeling from revelations about editors and others who run the paper.  But it’s still churning out plenty of news stories.  I like to peruse the list of stories every day, choosing the ones I want to read and deleting the rest.

One of the things I don’t look for in the Post is advice on how to do household chores.  I’ve never been much of a fanatic about housekeeping.  That’s probably a predictable reaction to my mother’s obsession with it.  I do keep things clean, and I can usually track down what I need.  My sister, on the other hand, happily followed my mother’s path.  When she listed her house for sale, the listing described her place as “impeccably maintained.”  I couldn’t help joking that my house was, by contrast, “peccably maintained.”  (I don’t think that’s a real word.)

Noting the chaos going on behind the scenes at the Post, I was recently astounded to come across advice on “how to iron better and faster.”  A long column set out “ironing tips and tricks.”

My reaction?  I don’t iron!  I haven’t ironed anything in years.  Does anyone still iron?

After moving to San Francisco 19 years ago, leaving behind all sorts of things I used in my former home, I purchased a new steam iron at Macy’s.  My apartment has a built-in ironing board, and I guess I expected to use this iron someday. But I never have.  It languishes in its box, resting on a shelf, eagerly waiting to confront some wrinkled clothes.  The funny thing is that I never wear clothes that need ironing.

I can’t help remembering how my mother was addicted to an old-fashioned heavy non-steam iron she must have acquired in the 1950s.  When I finally made a little money in my newly-launched legal career, I gave her the gift of a brand-new steam iron purchased at the old Sears department store on State Street in downtown Chicago.  Incredibly, she forced me to return it!  To my amazement, she clearly preferred her long-established habit of wielding that heavy iron and watching all manner of clothes turn wrinkle-free under her care.

I confess that in my long-ago past, I did iron a few items of clothing.  I specifically recall ironing the white cotton blouses we all wore back then.  But I happily left cotton blouses behind years ago.  When I worked as a lawyer, I sometimes wore silk blouses that needed special care, but I sent those to the cleaners rather than tackle them myself.

Today my wardrobe holds little besides t-shirts, jeans, and black pants that don’t need ironing.  I can’t imagine standing in front of an ironing board, wielding a dangerously hot appliance that does nothing more than remove wrinkles.  I view that as a tremendous waste of my time.

If you choose to wear clothes that need ironing, I certainly respect your choice.  You must prefer to wear clothes very different from mine.  Maybe you’d like to read the advice of Post columnist Helen Carefoot.  In her column, she covers topics like “how to use the ironing board correctly.”  Good luck to you in that pursuit!

Please forgive me if I don’t join you.  I’m quite fulfilled wearing my t-shirts and non-ironed pants while I take a hike outdoors, watch the latest must-see on TV, or sit in front of my desktop computer, happily typing away.

“A Raisin in the Sun”

The enduring acclaim for the play “A Raisin in the Sun,” as well as its film version, has inspired me to relate what happened when I saw the play for the very first time. 

In 1959, this stunning new play about a Black family in Chicago, written by the exciting young playwright Lorraine Hansberry, premiered at an upscale downtown Chicago theater, the Blackstone Theatre.  Although histories of the play often state that it had its premiere on Broadway in New York City, it actually appeared first in Chicago.

The sometimes-caustic theater critic for the Chicago Tribune, Claudia Cassidy, wrote an enthusiastic review of it on February 11, 1959, noting that it was “a remarkable new play” that was “still in tryout.”

“Raisin” represented an enormous theatrical leap because of its plot–a realistic portrayal of a Black family in Chicago confronted with a crucial decision–and because of the brilliant performances by its actors, including Sidney Poitier and Ruby Dee.

I was lucky to see “Raisin” during its pre-Broadway stay in Chicago.  As a Chicago public high-school student with limited funds, I saw it as an usher.

Ushering was a fairly casual affair in those days.  Often accompanied by a friend or two, I would simply show up at a theater about an hour before the curtain went up and ask the usher-captain whether she could use another usher.  The answer was invariably “yes,” and I would be assigned to a designated area in the theater where I would check tickets and seat ticket-holders. Ushering enabled me to see a great many plays and musicals at no cost whatsoever, and I ushered as often as my school’s schedule allowed.

I’ve never forgotten the startling incident that occurred during the matinee performance of “Raisin” I viewed as an usher.  In the midst of the performance, for no apparent reason, the actors suddenly stopped speaking.  The reason became clear when the theater manager strode onto the stage.  Bottling his rage, he explained that the actors had been struck by items thrown at the stage by patrons in the theater. 

I was shocked to learn of this extremely disrespectful behavior.  I’d never witnessed a problem of any kind created by audience members.

I concluded (fairly, in my opinion) that the audience must have included a number of boorish high-school students sitting in the balcony that afternoon thanks to “comp” tickets.  Some of them were undoubtedly displaying the bigoted attitude toward Black people that prevailed in their homes.

The Chicago area’s population at that time included large numbers of white people who were biased against Blacks.  Some of these whites felt threatened by any possibility of change in their communities.  Some later openly demonstrated to protest Dr. Martin Luther King Jr.’s visit to Chicago.

Here, in an upscale downtown theater, was the ugly and ignorant result of this bias.

Has anything changed since 1959?  For a long time, I thought it had.  During my years as a public interest lawyer and, later, as a law school professor and writer, I worked toward and believed in meaningful progress in the area of civil rights.  I had hoped that this feeling among some white people that they were threatened by Blacks–and eventually by Browns as well–had decreased.

Sadly, our recent history has revealed that this feeling still exists.  It’s even been encouraged by certain “leaders” in the political arena.  Some predict that violence could be the ultimate outcome.

I worry that we’re edging toward a return to the ethos of 1959 and the hostility displayed during the performance of “A Raisin in the Sun” I saw back then.  I fervently hope that this will not–indeed cannot–happen, and that most Americans vehemently reject the prospect that it will.

Fighting for a legal abortion in March 1970–and winning

In the aftermath of the Supreme Court’s dismantling of Roe v. Wade, we’ve all witnessed one anti-women’s rights assault after another.  There was, last week, a glimmer of hope in the abysmal state that is current-day Texas when a trial court judge issued a TRO allowing a pregnant woman to obtain a medically-needed abortion.

A TRO is a temporary restraining order, issued by a court, upholding the right of a plaintiff to obtain the remedy she needs right away to avoid irreparable injury. In the Texas case, the plaintiff was an expectant mother who very much wanted to give birth to a healthy child, but medical professionals had sadly concluded that her fetus would not survive and her own health and future fertility could be irreparably damaged.

In my view, the TRO was justified and the trial court reached the right decision.  But the Texas state attorney general intervened to stand in the way, and the Texas Supreme Court supported his position.  The result:  The plaintiff left the state of Texas to obtain the abortion she needed.

This appalling state of affairs reminded me of what happened in Chicago over 50 years ago.  I was working as a young Legal Aid lawyer in Chicago, co-counsel in a lawsuit filed in U.S. District Court on February 20, 1970, that challenged the constitutionality of the Illinois abortion statute.

I suddenly acquired a new client in March 1970 when I got a phone call from one of our Legal Aid branch offices.  The mother of a teenage rape victim had come into that office to report that her daughter had been raped and was now pregnant.  The mother asked whether we could do anything to help her daughter get a legal abortion.

This Black teenage girl, whom I dubbed Mary Poe, had been beaten and raped by two boys in her neighborhood, and her resulting pregnancy had been confirmed by a local physician.  I was already representing two other women, adult women we called Jane Doe and Sally Roe, but this young woman was different. She was a brutalized 16-year-old victim of rape, and her mother didn’t want her to be forced to bear the result of the rape.

I immediately began preparing documents to allow this Black teenager to intervene as a plaintiff in our case. On March 19, I filed these documents on behalf of Mary Poe, seeking to obtain “a legal, medically safe abortion,” denied at this time because her doctor had “advised her that under the language of the challenged statute” he could not “perform such an operation upon her without fear of prosecution.” 

The new Complaint joined the original plaintiffs’ prayer for relief and added the request that the court “enter a temporary restraining order [TRO] enjoining the defendants from prosecuting [one of our plaintiff physicians, Dr. Charles Fields] under the challenged statute if he terminates her current pregnancy on or before March 27, 1970.  Unless this relief is granted by the court, this plaintiff will suffer irreparable injury.”  Dr. Fields had examined Mary Poe and concluded that her pregnancy could be safely terminated until on or about March 27.

The district judge presiding over our case, William J. Campbell, was on vacation, and we turned to another district judge, Edwin Robson, who was reviewing documents in Campbell’s absence.  So on March 23, I filed a motion for leave to intervene on behalf of Mary Poe and for a TRO allowing her to receive a legal abortion.  Robson ordered the defendants to file briefs by March 26 and set our motion for ruling on March 27.  On that date, the last day Dr. Fields said the pregnancy could be safely terminated, Robson finally granted the motion for leave to intervene, but he denied our motion for a TRO.  He continued that motion until Campbell’s return in April.

Back in my office, I prepared Mary Poe’s appeal to the U.S. Court of Appeals for the Seventh Circuit, which sat in a courtroom several floors above the district court courtrooms.  As soon as the appellate court allowed me to, I argued before Judge Luther Swygert, chief judge of the appellate court, appealing the Robson ruling that denied Mary Poe a legal abortion.

Judge Swygert ruled on March 30:  “[T]his matter comes before the court on the emergency motion of [Mary Poe].  Upon consideration of the motion…IT IS ORDERED that a temporary restraining order be entered enjoining defendants…from prosecuting plaintiff [Dr. Fields] under [the Illinois statute we were challenging], if he terminates the current pregnancy of [Mary Poe].”

I remember standing in the courtroom to hear this order spoken out loud by Judge Swygert, a brilliant and fair-minded judge.  He became my enduring judicial hero ten months later, in January 1971, when he issued the ruling upholding our constitutional challenge.

We’d won a TRO allowing Mary Poe to get a legal abortion!

When Judge Campbell returned to his courtroom in April, he was confronted with the appellate court’s decision, and there was no way he could change it.  But he went on to oppose us at every possible turn as we proceeded with our lawsuit.  I describe everything that happened in my forthcoming book, which I’m hoping will appear in print in 2024.

In the meantime, I’ll state my unwavering belief that Campbell was an early version of the “robed zealots, driven by religious doctrine, with no accountability,” described by Maureen Dowd in her opinion column in The New York Times on December 16th.  In this column, “Supreme Contempt for Women,” Dowd clearly indicts “the Savonarola wing of the Supreme Court,” who couldn’t wait “to throw [Roe v. Wade] in the constitutional rights rubbish bin.”  Judge Campbell would have fit right in.

Quote: “Congratulations on your life!”


Are you a fan of Broadway musicals?  I cheerfully admit that I am. Thanks to my parents, I’ve been an enthusiastic fan since my early childhood.  I must have been only 5 or 6 when our family began heading to “summer stock” in the suburbs north of Chicago.  Where a shopping mall now sits, musical productions introduced me to the excitement of live performances combining music, lyrics, and dialogue.  The most memorable was a production of “Song of Norway,” a musical that opened on Broadway in 1944. It features songs with lyrics set to the haunting music of Edvard Grieg.  Those songs have stayed with me my whole life, and as a bonus, I became a great admirer of Grieg’s music.

My parents first introduced me to a genuine theater experience when we watched “South Pacific” at the Shubert Theater in downtown Chicago when I was only 9.  In the leading role of Nellie Forbush (played on Broadway by Mary Martin) was Janet Blair, an American actress and singer who performed this part for three years in a touring production that popped up in venues all across the country.  Chicago was one of the first.  We bought the album and played the songs over and over.  Rodgers and Hammerstein won my heart right then and there.

Around the time I turned 12, while my family was still living in Chicago, my parents treated us to a production of “Oklahoma!” that I’ve never forgotten.  Guess who played Laurey?  Relative unknown Florence Henderson, later of TV fame, who was so good that I made a point of remembering her name. We also saw a memorable performance by the revered D’Oyly Carte Opera Company, featuring energetic Brits in Gilbert and Sullivan’s “H.M.S. Pinafore” and “Trial by Jury.”

When our family moved to LA, and my father died later that year, my attendance at musicals came to an abrupt halt.  But after I returned to Chicago as a teenager, my fascination with Broadway shows revived. Touring companies kept coming to Chicago, and I discovered that I could either attend them with a paid ticket or attend free by becoming an usher.

At that time, ushering was a very casual affair.  I could just show up, usually with a friend, and volunteer to usher.  I’d be directed to the woman in charge, who would always say “Yes” and find a spot for me somewhere in the theater, where I would check tickets and seat patrons before finding a seat for myself.  In this way I saw a lot of Broadway shows, both musical and purely dramatic, during the late 1950s and throughout the ‘60s. 

I could of course sometimes pay my own way with my babysitting earnings, and at other times my mother treated me to tickets as a gift.  In this way, I saw “West Side Story” on stage at the Erlanger Theater (later demolished to make way for the Daley Center), and, to use current parlance, I was blown away by its drama, music, and choreography.  I’d already attended quite a few Broadway shows by that time, but I’d never seen anything like it.

Other memorable musicals I saw during those years included “My Fair Lady,” “The Pajama Game,” “The Music Man,” and “The Most Happy Fella.”  Film actor Forrest Tucker was formidable as music man Professor Harold Hill in his touring production (it ran for 58 weeks at the Shubert Theater).  I bought the LP recordings of all of them and played them over and over on my small Webcor phonograph, trying to learn the lyrics.  (I saw many dramas during these years as well, but those aren’t within the scope of this post.)

During a brief visit to New York City in 1967, my mother and I saw an exciting performance of the original production of “Mame,” starring Angela Lansbury as Mame and Bea Arthur as Vera Charles.  I’ll never forget watching these two phenomenal women dancing together, arm in arm, while they sang “Bosom Buddies.”

Before I changed my life and moved to LA in 1970, I saw a few more Broadway hits in Chicago, including “Man of La Mancha,” “Fiddler on the Roof,” “Camelot,” “Hello, Dolly!” and “Bye Bye Birdie.”

I met and married my marvelous husband (I’ll call him Marv) in LA in 1971.  We shared a great deal, including a love of the theater. During the year we lived in LA, we saw a lot (including a play featuring screen-legend Henry Fonda as Abraham Lincoln).  Among them were remarkable productions of two Broadway musicals: “Company,” with an exciting cast, and “Knickerbocker Holiday,” starring Burt Lancaster.  Lancaster, not known for his singing, wrote in his playbill blurb that he’d learned how to sing from his friend Frank Sinatra.

Fast-forward 15 years. Marv and I saw countless plays and musicals while we lived in Ann Arbor, La Jolla, and Chicago.  (We also saw the original production of “Grease” during a brief stay in NYC in 1973. That’s a story for another day.)  

But I’ll zoom ahead to London in March 1986.  My sister had visited London shortly before Marv and I decided to travel there that March.  Although I didn’t always take my sister’s advice, one telephone call from her was different. She enthusiastically praised a new musical production in London called “Les Misérables.”  Based on the Victor Hugo novel, the story is set in 19th-century France, where Jean Valjean is arrested for stealing a loaf of bread. It follows him after he’s released from prison and goes on to lead an admirable life while being relentlessly pursued by a ruthless police inspector, Javert.

Sis had seen this new musical, and she couldn’t praise it enough. “I know it’s expensive,” she said, “but it’s worth it!”  Marv was earning peanuts as a math professor, I was “between [my poorly-paid part-time] jobs,” and when I checked, the tickets were $75 each, a real stretch for us.  But because of our love of the theater, and because we’d already seen many plays and musicals in London (beginning in 1972) and never been disappointed, we plunged ahead and ordered those pricey tickets.

You’ve probably guessed what happened next.  We witnessed the original production of “Les Misérables,” transplanted from a smaller theater to the enormous Palace Theatre because of its gigantic success.  Once we heard the very first notes of the overture, introducing the astounding performance we were about to watch, we were enthralled by the phenomenon that has become “Les Mis.”

We were especially enthralled by the astonishing performance of one man:  Colm Wilkinson, inhabiting the leading role of Jean Valjean. Wilkinson, a 42-year-old Irish tenor and actor, gained worldwide fame when he originated this powerful role, first in London and later in New York.  His rendering of the song “Bring Him Home” made the hair on the back of my neck stand up.  I’d never heard a performance like his in any Broadway musical I’d seen. 

The entire production, including songs like “I Dreamed a Dream,” “On My Own,” and “Master of the House,” was memorable enough to last a lifetime, and it still thrills me today.  My love affair with “Les Mis” led me to see it twice more back in Chicago, taking my young daughters to witness it with me.

Fast-forward one more time:  San Francisco in 2023.  After moving to SF in 2005, I saw great Broadway hits like “Wicked,” “In the Heights,” “The Book of Mormon,” “Something Rotten!” and “Hamilton.”  My younger daughter (another theater-lover) and I joyously went to most of these together.  A year before the pandemic hit, my older daughter (M) asked me to join her to see “Cats” in San Jose in 2019.  Please don’t laugh.  It was incredibly good!  (It’s unfortunate that the ill-conceived film version has besmirched a great musical that, if done well, should be seen and heard live.)

But the pandemic sadly put a halt to my attending live theater performances.

M has a love of “Les Mis” much like mine, and she knows the history embedded in it (she graduated summa cum laude in French literature and history from Harvard).  When a touring company announced that it would appear in San Francisco this year, M knew she wanted to see it again.

Soon I was invited to join M, her husband, and her daughters at a performance in late July, and I jumped at the chance.  The pandemic had lessened its grip, and I promised my younger daughter I’d wear a mask throughout the performance.

So I was thrilled last month to see “Les Mis” for the fourth time, about two decades after the second time in Chicago.  The production was exciting, and all five of us loved it.  Midway through, I began thinking about the man who had inhabited the role of Jean Valjean in my first go-round and searched my memory for his name. “Colm,” was it?  During intermission, I glanced at my phone and searched for both Colm and “Les Misérables” in 1986, and I came up with it: Colm Wilkinson.

As we left the theater, I began to tell my family how I’d seen the original Jean Valjean in London, Colm Wilkinson, and just how wonderful he was.  As we approached our parking structure, a woman walking near me must have overheard and looked at me in disbelief.  Clearly a knowledgeable fan of “Les Mis,” she skeptically asked, “You saw Colm Wilkinson in London?”  “Yes,” I replied, nodding.  “My husband and I saw Colm Wilkinson in London in 1986.”  This woman (I’ll call her W) repeated, with emphasis, “You saw Colm Wilkinson in London in 1986?”  I nodded again.  Startled and amazed, W felt the need to say something.  She blurted out:  “Congratulations on your life!”  I smiled and nodded again, thanking her for her stunning turn of phrase.

I was indeed stunned by this phrase, one spoken by a complete stranger.  On reflection, I want to thank W for saying that I should be congratulated for my life.  In many ways, I have indeed had a remarkable life.  Watching Colm Wilkinson as Jean Valjean in London in 1986 constitutes just a tiny part of it.  It was an astounding performance, and I’ll always remember it.  I was extremely lucky to see him that night.  But all that Marv and I did was buy our tickets and sit in the audience, thrilled by his performance.

I honestly hope that the whole scope of my life–what I’ve done to effect positive change for our planet, to support worthy political outcomes, to help people in need, to work for equal rights for all Americans, to be a good wife, mother, and grandmother–in short, to live the kind of life I’ve always tried to live–is what truly deserves, on balance, a small measure of congratulations.

What about cashmere?

To begin, let’s define “cashmere.”

The 1985 edition of The American Heritage Dictionary states simply:

  1. Fine, downy wool growing beneath the outer hair of the Cashmere goat.  2.  A soft fabric made of wool from the Cashmere goat or of similar fibers. [After Kashmir, a region in India.]

The Cashmere goat is described as a goat “native to the Himalayan regions of India and Tibet, and prized for its wool.”

We can probably find a lengthier, more recent description in Wikipedia, but the old definition is just fine.

Now, let’s consider the disturbing role that cashmere sweaters played during my high school years.

I attended a Chicago public high school decades ago.  My school was filled with a wide variety of students from a number of different ethnic groups. Some of my fellow students were aspirational and willing to work hard to achieve success both academically and socially.  In many ways it was an inspiring environment.  Unfortunately, however, a bunch of cliques held sway, dubbing each student “popular” or not.

I was generally viewed as one of the popular kids.  I was a member of the most desirable social clubs, I was elected to class office—twice–and I was chosen by Mrs. Keats to join the mixed chorus.  (Mrs. Keats admitted you to the mixed chorus only if you were either a great singer or you were a good-enough singer who was also popular. I fell into the latter group.)  So I was spared the worst treatment doled out by the cliques.

But it was an evil system, allowing the social clubs to blackball potential members and do countless other destructive things.

One of the most destructive focused on the clothes we wore. I have no knowledge of the boys’ clothing choices.  But I do remember that most of the girls were eager to acquire what they viewed as fashionable clothes. Often these were pricey, and not everyone could afford them.

Chief among the clothes in this category were cashmere sweaters and, frequently, matching woolen skirts. (Yes, girls were required to wear skirts to school in that benighted era, even when Chicago temperatures dipped below zero during our frigid winters.  This was one more example of gender inequity.)

Emphasis on cashmere was particularly noticeable.  When gift packages were opened at birthday parties, the cry would go up:  “Cashmere!  Cashmere!”

After my father died when I was 12, my family of three lived on a modest income, no longer supported by the breadwinner my father had been.  I became quite frugal, choosing not to add to my mother’s budget problems.  But, ironically, because my mother’s family owned a women’s apparel store, I was able to wear clothes not terribly different from my friends’.  I simply had fewer of them. 

My mother, raised in her family’s retail business and now working part-time in their store, thought it was important to wear the right clothes for every occasion.  So I wasn’t completely shut out of the cashmere game, and I owned one or two cashmere sweaters. But what about the girls who couldn’t afford to buy them?

Once I began working and had my own disposable income, I sometimes added a new cashmere sweater to my wardrobe.  Truthfully, cashmere can be soft and warm, making it desirable in cold climates.  But I stopped buying new cashmere sweaters years ago.  Please read on….

Let’s look at the way cashmere is promoted.

In March 2023, I wondered just how cashmere sweaters are currently bought and sold.  Knowing what I do now (see below), I wouldn’t consider buying a new one for myself.  But cashmere sweaters are readily available.

Checking the website for one store in the mid-price-to-upscale category, Nordstrom, I discovered the following:

Hundreds of cashmere sweaters were listed on the website in a wide range of prices, beginning at about $100.  Among the highest prices I came across were a Balmain brand for $1,995 and a Loro Piana brand for $2,050.  Some of the sweaters had reduced prices as winter sweater-wearing weather wound down.

Nordstrom partially redeems itself by having a policy called “Sustainable Style,” in which at least 30% of an item is made up of “sustainably sourced” materials.  A few sweaters include “recycled cashmere.”  Without doing further research, I assume that the store is aware of the price we pay for luxury goods, not merely in dollars but also in the harm they can cause to the environment.  (Of course, “fast fashion,” which is generally cheap, is also harmful.  But that’s a story for another day.)

Patagonia is a high-quality retailer featuring outdoor clothing.  With a history of concern for the environment, it has recognized the harm inherent in cashmere and has stated the following on its website:  “We use recycled cashmere (blended with 5% virgin wool) because of its soft, lightweight warmth.”  The website says a lot more, which I’ve added below.

Now let’s consider something I’ve been hinting at:  The harm done to the environment by the production of cashmere.  I’ll hazard a guess that most of you are totally unaware of this harm.

In her 2019 book, inconspicuous consumption: the environmental impact you don’t know you have, Tatiana Schlossberg devotes a chapter to “the yarn that makes a desert.”  This chapter is a cleverly written discussion of the worldwide demand for cashmere, along with the destructive path the breeding of cashmere goats has caused.

Schlossberg focuses on Mongolia and its Gobi desert, where nomadic herders have been shepherding their cashmere goats for thousands of years.  These goats have “some of the world’s warmest, softest hair,” which “used to be considered a true luxury item.”  Unfortunately, the goats also damage the soil, harming the plants they eat and changing grassland to desert.  The result is, according to Schlossberg, that 90 percent of Mongolia is at risk of becoming desert unless management practices change.  Climate change plays a part, but “right now, the goats are directly implicated.”

Increasing demand for cashmere has driven down the price, so the herders breed more goats, and the supply of high-quality cashmere has shrunk.  “To be sure, there is still a lot of really expensive cashmere out there,” but there is notably a desire for cheap cashmere, particularly in the US.  Schlossberg makes clear that it’s not the fault of the consumer that some cashmere is now cheap, and “it’s not wrong to want nice things or to buy them, sometimes.”  But…“we can’t all have unlimited amounts of cashmere if we want to live in a world that isn’t spinning into a desert, in order to keep us swathed in cashmere at cheap prices because that’s what we’ve decided is important.”

She adds:  “It’s all part of the same problem, and it’s not just cashmere.  It’s everything we wear and how we use it.”  [Please see my August 2022 blog post on the issues related to cotton, “Totin’ Cotton,” https://susanjustwrites.com/2022/08/18/totin-cotton/.]

Now let’s look at the Patagonia website, which confirms everything Schlossberg has written:

“In the 1960s and ’70s, cashmere was used as a luxury material for overcoats, suits and sweaters. As people became familiar with its soft, warm feel, demand for the material grew. Today, cashmere is widely used throughout the industry as a commodity fiber, which is leading to the overbreeding of cashmere goats, a decrease in fiber quality and the desertification of the Mongolian region where the vast majority of cashmere goats live.

“Patagonia uses high-quality recycled cashmere to buck this trend and reduce the environmental cost.  We started using recycled cashmere in 2017 after reviewing our supply chain and noticing an increase in the overgrazing of cashmere goats in Mongolia. Today, we collect pre-consumer scraps from European factories and send them to a sorting facility where they are meticulously sorted by color and then put in large machines that shred the fiber. We blend those fibers with 5% virgin wool to create a strong, undyed yarn that we use to make sweaters, beanies, scarves and gloves.”

Patagonia adds:  “Innovations like Cashpad—a mechanical recycling program for cashmere and wool textile waste—are helping us scale up the production of recycled cashmere. In coming seasons, we hope to incorporate it into more of our products.  Together, let’s prioritize purpose over profit and protect this wondrous planet, our only home.”

By the way, even Patagonia isn’t shy about asking high prices for its cashmere sweaters.  A women’s “recycled cashmere” cardigan is listed on its website at $269.

Some concluding thoughts

It’s heartening to learn that changes are happening in the retail world.  “Recycled cashmere” may begin to diminish the harm done by breeding cashmere goats simply to add to the world’s supply of cashmere sweaters.  “Consumerism”—the desire to acquire more and more things–still rides high in our world, but we appear to be moving towards more enlightened consumerism.

Looking back on my high-school years, I clearly see that our fixation on cashmere was all wrong.  We honored the wrong values.  Instead of clamoring for cashmere sweaters and other pricey material goods, we should have set other goals for ourselves.  I believed in the value of an excellent education, and I worked hard to get one, but I should have aspired to do more than that.  “To make the world a better place,” as the current phrase goes. 

I was aware of poverty in Chicago, and I was vaguely troubled by the unequal position of minorities in the city, but I never tried to achieve change.  It wasn’t until much later that I recognized the need to do so.  Instead, during high school I bought into the desire to wear cashmere sweaters.  Although I was not among the more affluent students–those who could afford countless luxuries–I never thought about those students who couldn’t afford any of them.  

Looking back, I regret that I had such a limited outlook on the world at that time.  I like to think that I’ve lived the rest of my life in a way that has tried “to make the world a better place.”