Author Archives: susanjustwrites

All the President’s Men: an update

A few weeks ago, I plucked an old movie from my TV playlist and re-watched the 1976 award-winning film, “All the President’s Men.”  I found it not only the riveting film I remembered but also remarkably relevant to watch right now. 

The film tells the fast-moving story of two intrepid journalists working at The Washington Post in 1972, as the media world gradually became aware of what came to be known as “Watergate.”  Although President Richard Nixon had a commanding lead in the polls and was about to be reelected in a landslide in November 1972, his sense of insecurity and inferiority led him, along with his cronies, to sponsor a break-in at Democratic Party headquarters in the Watergate office building in June 1972.  The break-in was less than totally successful.  Moronic criminal-types made a couple of foolish errors that led to the detection of the break-in and their arrest by DC police.

At The Post, the two young journalists, Bob Woodward and Carl Bernstein, faced innumerable obstacles as they tried to ferret out the truth of exactly what had happened and why.  The story ultimately focused on WHO:  Who were the players in the Nixon administration who were pulling the strings behind the Watergate break-in? 

To see the whole story play out, you may want to watch the film yourself.  But whether you watch it or not, please keep in mind just how relevant it is today.

Watergate was only one of the “dirty tricks” Nixon and his cohorts employed to undermine his political opponents.  On January 20, a president demonstrably worse than Nixon was inaugurated.  After a campaign replete with disinformation, he has already begun to effect enormous change in our country.  More than ever, we need brave and intrepid journalists like Woodward and Bernstein to ferret out the truth behind any possible wrongdoing.

The role of The Washington Post is central in both eras.  In 1972, Woodward and Bernstein had to persuade their reluctant editor at The Post to support them as they pursued the truth.  He finally relented and allowed them to publish their findings.  But if they had faltered in the face of opposition, the truth might never have come out.

In 2025, journalists at The Post have taken a different route.  A popular columnist, Jennifer Rubin, loudly spoke out against her editors and her publisher, Jeff Bezos, whom she saw as kowtowing to the incoming administration.  She and her colleagues decided to quit working at The Post, proclaiming that it was no longer seeking the truth.  On January 20, she wrote:

“The American people certainly will not be front and center at Trump’s inauguration. It’s all about him and his billionaire cronies, including the media owners who have buckled to his will. ‘Big-name billionaires are lining up to strengthen their relationships with incoming President Donald Trump during next week’s inauguration festivities,’ Forbes reported.  When you add in [others] whose combined wealth dwarfs many countries’ GDP’s—you get a vivid tableau of the new oligarchy. We usher into office today a government of, by, and for the billionaires.” 

Rubin and other like-minded journalists decided to create a new entity, The Contrarian.  Norm Eisen explained how it started:

“Jen and I agreed to launch [this] venture, rounding up…over two dozen contributors in a matter of days.  We kicked off with … Jen’s Post resignation letter. While we had high hopes, we never could’ve imagined what happened next. A quarter of a million subscribers poured in … And the engagement was through the roof, with over 1,000,000 views per day.” 

Rubin proclaimed that the new venture hoped to be “a…space where independence is non-negotiable. Here, you won’t find cozy alliances, half-measures, or false equivalences. We bend the knee to no one, vigorously challenge unchecked authority, and champion transparency and accountability.  In a nation awash with noise and growing disinformation, The Contrarian cuts through the static to deliver sharp, uncompromising insights…. Our loyalty is to … the truth, and to our democratic ideals—many of which are currently under threat.”

I’ve signed up to get The Contrarian delivered to my inbox.  I hope it will stick to its commitment to the truth.  But I haven’t given up on the “legacy media”–mainstream publications like The Washington Post, The New York Times, the San Francisco Chronicle, and the San Francisco Standard.  All of them still land in my inbox every day.  (I also watch TV news programming when it appears to report the news fairly.)  I think that all of these publications include at least a few brave journalists, like the now-legendary Woodward and Bernstein, still searching for the truth, still speaking out to report wrongdoing in DC or elsewhere. 

I’ll be watching to make sure they don’t falter, hoping that, despite editors and publishers who may stand in their way, they’ll continue to live up to their role as journalists and tell their readers the truth.

Those tempting holiday treats

December means one delicious holiday treat after another.  We’re all tempted to indulge.  But before you start munching, you might want to know the results of a couple of studies related to those holiday sweets.

First, if you love chocolate, you may already be aware of the virtues of dark chocolate.  But an important new study has just confirmed that only dark chocolate is associated with lowering the risk of developing diabetes.  This 30-year-long study, conducted at the Harvard Chan School Department of Nutrition, focused on almost 200,000 people who started out free of diabetes. When the study ended, nearly 20,000 had been diagnosed with Type 2 diabetes. A lot of them reported specifically on their dark and milk chocolate intake.

It’s interesting, first of all, that those who ate at least 5 ounces of any kind of chocolate per week had a 10% lower risk of developing T2 diabetes than people who rarely or never ate chocolate.  But significantly, dark chocolate had a much bigger impact than milk chocolate.  Participants who ate dark chocolate had a 21% lower risk, with a 3% reduction in risk for every serving of dark chocolate eaten in a week.

At the same time, milk chocolate was NOT associated with reduced risk even though it has a similar level of calories and saturated fat.  Why?  According to the researchers, it’s the polyphenols in dark chocolate that may offset the effects of fat and sugar.

So before you bite into a mouthwatering chocolate dessert, try to find one made of dark chocolate.  I’ve been sampling some new dark chocolate candy bars, and they’re delicious.  It’s really no great hardship to switch from milk chocolate to dark.

You might also want to know about new research into one feature of the sweets we love:  their frequent dependence on high-fructose corn syrup.

Scientists at Washington University in St. Louis have found that dietary fructose promotes the tumor growth of certain cancers in animal models.  The finding in this study, published December 4 in the journal Nature, could open up new avenues for care and treatment of many types of cancer.

“The idea that you can tackle cancer with diet is intriguing,” said Gary Patti, a professor of chemistry, genetics, and medicine at the WashU School of Medicine.  The culprit seems to be fructose, which is similar to glucose.  Both are types of sugar, but the body seems to metabolize them differently.  Both are found naturally in fruits, vegetables, dairy products, and grains, and both are added as sweeteners in many processed foods. But the food industry has favored fructose because it’s sweeter. 

Consumption of fructose has escalated dramatically since the 1960s, and Patti pointed out that the number of items in your pantry that contain high-fructose corn syrup, the most common form of fructose, is “pretty astonishing.”  “Almost everything has it,” he added.  This includes foods like pasta sauce, salad dressing, and ketchup.  “Unless you actively seek to avoid it, it’s probably part of your diet.”

The problem is that fructose apparently promotes the growth of tumors.  I’ll skip the technical stuff, but what’s important is that we should avoid dietary fructose as much as we can.  While investigators at WashU Medicine and elsewhere around the world continue to look into possible connections between the surge in fructose consumption and the increasing prevalence of cancers among people under the age of 50, let’s try to limit how much fructose we consume.

Here’s my advice:  If you plan to indulge in some yummy holiday treats, try to find those made with dark chocolate and those that don’t include high-fructose corn syrup.  If you can.

Happy holidays!

JFK

Today is November 22, a day forever marked by an American tragedy.  On this day in 1963, President John F. Kennedy was assassinated in Dallas, Texas.

As a young kid, I was inspired by Kennedy’s appearance in my world when the media focused on his candidacy for the vice-presidential nomination at the 1956 Democratic Convention.  A vivid contrast to Presidents Truman and Eisenhower, he was a youthful and vigorous U.S. Senator who advocated positive changes in our country.  Like many others in my generation, I found that JFK’s emergence on the political scene intensified my interest in American politics.  Later that year, my sister gave me a copy of “Profiles in Courage,” Kennedy’s book about political heroes in American history.  I treasured that book and eagerly read and re-read it.  Over the years, I’ve continued to collect books about JFK.  My collection includes my original copy of “Profiles in Courage.”

After his election as President in 1960, Kennedy continued to inspire me.  And on June 11, 1963, he spoke out in favor of equal justice for all Americans.  I had returned to my home in Chicago after my college graduation at WashU in time to watch the televised speech he gave that day.

JFK began by noting Alabama Governor George Wallace‘s refusal, despite a court order, to allow the admission of two Black students to the University of Alabama.  He went on to say that “difficulties over segregation and discrimination exist in every city, in every State of the Union, producing in many cities a rising tide of discontent that threatens the public safety.”  These and other statements were important.  But I was mainly moved by these words: “The heart of the question is whether all Americans are to be afforded equal rights and equal opportunities, whether we are going to treat our fellow Americans as we want to be treated.”  After noting the special problem of racial discrimination, he added: “[T]his Nation, for all its hopes and all its boasts, will not be fully free until all its citizens are free.”

He said he planned to ask Congress to enact legislation giving all Americans “the right to be served in facilities…open to the public,” including hotels and restaurants, and to authorize the federal government “to participate more fully in lawsuits designed to end segregation in public education” and “greater protection for the right to vote.”  (His efforts eventually led to the enactment of the Civil Rights Act of 1964.)  He closed by asking for “the support of all our citizens.”

I sat transfixed in front of the TV, totally in awe of this speech, and I became an ardent supporter of the same ideals. 

Thanks to JFK, as a young person I developed a consuming interest in politics, and I began to think about a future where I could be involved in politics in some way.  One possible path occurred to me:  Attending law school and becoming a lawyer.  As I wrote in my handwritten journal in 1958, “I have developed a keen interest in law, and at the moment, I am busily planning to study law if possible.  At one time I believed I would be a writer….  Now, law and politics beckon, and…I am trying to convince myself that nothing is impossible and that if I want it badly enough, I will get it!”  Still a teenager, I wasn’t ready to make the leap to law school, but I did look forward to a future somehow focused on government and politics.  So I majored in political science in college and went on to be a graduate student in that field before abandoning it in favor of law school.

JFK’s assassination on November 22, 1963, traumatized me and probably most other Americans at the time.  It was truly shocking.  Looking back, I realize just how much it affected me.  As I wrote in my handwritten journal on the day after he was assassinated: “When the news of [his] death … was announced, I was too stunned to cry, too horrified to do much of anything but say the words echoed over and over by seemingly everyone…. I can’t believe it!  It’s incredible!  How could anyone do such a thing? And why?”  I added: “I was mourning the personal loss of an individual who had brought such vigor, such excitement, such brilliance, such intelligence, such energy…to everything he ever did in his life.  [He] was a personal icon to me, a hero, a leader to follow…who has always stood, in my eyes, for everything that was right in politics and government, and in the pursuit of power for noble aims, and who, I am certain, played a large part in motivating me…toward a life in politics and government for myself.  The result is perhaps a ‘new’ resolve…my resolve to dedicate my own life, as [he] dedicated his, to what is not always the easiest but what will surely be the most rewarding for me…a life of devoted public service to my country.  If I can, I will pursue legal studies for the next three years to prepare me [or else immediately devote] myself to the ideals of hard work and sacrifice in the public interest.” 

I’d grown up in an era when political assassinations happened only in “banana republics.”  Seeing a young, vital, and inspiring political leader like JFK cruelly shot down changed forever my view of America as a place where political transitions always occur peacefully.  The later assassinations of other American leaders (like Martin Luther King, Jr., and RFK) further traumatized me and others in our country. 

But although I lost him as our president, JFK had motivated me to pursue the study of American politics as well as the study of law.  At a pivotal moment, I chose to leave academia with the goal of becoming an activist via the study of law.  

After graduating from law school, I did become an activist.  I was in the vanguard of lawyers who fought to secure women’s reproductive rights.  My co-counsel and I won a hard-fought victory, invalidating the restrictive Illinois abortion statute in 1971 (Doe v. Scott, 321 F. Supp. 1385 (N.D. Ill. 1971)).  As part of that lawsuit, I represented a Black teenage rape victim, winning a TRO in the appellate court that enabled her to have a legal abortion in March 1970.  This lawsuit is the focus of my forthcoming book, tentatively titled On the Barricades.

Throughout my life and my varied career, I’ve maintained my enormous interest in politics, government, and law.  Although I now view myself primarily as a writer, I continue to enthusiastically follow all of that today, whether current trends align with my personal views or not.

I will forever be indebted to JFK for inspiring me to follow this path.

Polls 2024

Are you fed up with polls?  I am.

I’m mad that every time I turn on TV news, I’m confronted with one poll after another.  The media seem obsessed with them, perhaps because they’re desperately trying to come up with fresh stories to fill their nearly endless need to offer viewers something new and different.

I’ve always been leery of polls.  First, I’ve never been asked to be in any poll, and no one I know has either. I’ve always wondered exactly who are the people answering questions in these polls.  Currently, polls seem to be focusing on voters in “swing states” and voters in one demographic group or another.  Maybe I don’t fit into any of those categories.  But I think my views on any number of issues are valuable and should somehow be included in these polls.  Why aren’t they?

Further, I’m quite certain that the people who do participate are often led to answer questions in a given way, thanks to questions that are, in my view, slanted in one direction or another. You’ve probably noticed that, too.

Instead of getting mad, maybe I should take the advice of Ezra Klein, an opinion writer for The New York Times.  On October 13, he published a column, “Ignore the Polls.”  He makes a bunch of good points.  To begin with, he notes that you’re probably looking at polls to know who’ll win.  But the polls can’t tell you that.  On average, polls in every presidential election from 1968 through 2012 were off by two percentage points.  More recent polls, in 2016 and 2020, were off by even more.  Klein states that pollsters today are desperate to avoid the mistakes they made in 2016 and 2020, when they undercounted Trump supporters.  So some of them are asking voters to recall who they voted for in 2020 and then using that info to include Trump voters more accurately this time.  But the results are very different when pollsters don’t ask voters to recall what they did in the past.  According to Klein, voters are “notoriously bad at recalling past votes.”  So why do the pollsters even bother asking?

Klein adds that polls are “remarkably, eerily stable.”  Events keep happening (like assassination attempts and televised debates), but the polls haven’t really changed.  So Klein advises us to give ourselves a break.  “Step off the emotional roller coaster.  If you want to do something to affect the election, donate money or time in a swing state…or volunteer in a local race.  Call anyone in your life who might actually be undecided or might not be registered to vote or might not make it to the polls.  And then let it go.” 

That’s exactly what I’ve been doing.  I’m glad my outlook resembles Ezra Klein’s.  Now if only the media would pay attention to his wise advice.  Hey, media people: ignore the polls.  Instead, seek out interesting stories about the candidates, the voters, and the issues.  Then let it go.

Cynicism can be bad for your health

Hey, it’s easy to be cynical these days. 

We’re faced with lie-spouting politicians threatening democratic rule in our country.  We’re confronted by incompetent jerks who make countless mistakes, or even try to scam us, at almost any business we patronize.  And I can’t forget the maniac drivers who weave from lane to lane at illegally high speeds, threatening to kill us every time we’re near them on the freeway. 

But hold on a minute.  A social scientist/author says that having a cynical worldview isn’t such a great idea.  Jamil Zaki wants you to know that having a cynical worldview may have a negative effect on your health. 

In his new book, “Hope for Cynics:  The Surprising Science of Human Goodness,” Zaki concedes that being a cynic can make us feel safer and smarter than the selfish, greedy, and dishonest people in our midst.  But his research at Stanford (where he’s the director of its Social Neuroscience Lab) suggests that it’s much better to become “hopeful skeptics.”  In other words, it’s okay to be critical of troublemakers, but you should also recognize how kind and generous most people really are.

What’s at play here?  Well, we tend to pay more attention to negative events than to positive ones.  This “negativity bias” leads us to remember an occasional driver who cuts us off in traffic while we ignore the countless drivers who are obeying the rules of the road.  Zaki says we should take 15 minutes out of our day and pay attention to the kindness all around us instead of the rudeness we encounter now and then.

Similarly, he recommends that we spread “positive gossip,” pointing out good deeds and kind behavior instead of doing the opposite–spreading mean-spirited gossip about people we dislike.

What’s the benefit of avoiding cynical thought?  You’ll probably feel better about humankind, and that will probably lead to better health.  According to Zaki, the cynical among us are more likely to suffer from depression, heart disease, and feeling burned out.

In the midst of a heated campaign for mayor in San Francisco, one candidate has asked voters to end “the era of cynicism.”  He’s a political novice who has spent much of his personal fortune on philanthropic efforts aimed at improving life in our city, and he’s angry that his opponents have belittled those efforts.  I don’t blame him one bit.  Even though his philanthropy hasn’t always met its goals, the other candidates shouldn’t stoop to cynical bashing.  Instead of criticizing him (as they did in a recent televised debate), they could be praising his attempts to make life better. They could adopt a positive approach and advocate their own ideas for achieving worthwhile goals for our city.  Sadly, the negativity hurled during the debate was so awful that I immediately stopped watching.

As The New York Times book review of Zaki’s book has warned: “Don’t Fall into the ‘Cynicism Trap.’”  I don’t plan to, and I hope you won’t either.  Let’s aim for hopeful skepticism.  If we avoid cynicism and instead pay more attention to the kindness around us, we just might feel better.

Watching a new musical on Broadway 50-plus years ago

In April 1973, my husband (I’ll call him Marv) and I left our home in Ann Arbor, Michigan, and headed for New York City.  Marv was a terrific math professor at the University of Michigan, and he’d already earned tenure there.  Thanks to recognition by other mathematicians, he was invited to speak at a math conference to be held at NYC’s famed Biltmore Hotel, and I decided to tag along.

A bunch of my law-school classmates were living in NYC just then, and I contacted a few of them about getting together while Marv and I were in town.  One of my favorite classmates was my close friend Arlene, and she immediately made plans to see both of us one evening during our stay.

I was thrilled when Arlene surprised me with a terrific plan.  She was purchasing tickets for all three of us to see a hit musical playing on Broadway.  I’ve always been a huge fan of Broadway musicals, beginning when I was a kid, and I was excited at the prospect of seeing this one.  I may have heard something about it even before we got to NYC, but I didn’t know any details.  In the pre-internet era, it was hard to get details like that.

After a scrumptious dinner somewhere in Manhattan, the three of us set out for Broadway and the musical Arlene had chosen.  We excitedly took our seats in the balcony as the lights dimmed and a hush fell over the audience.

As the curtain rose, I gasped. The musical was “Grease,” and it began at a 1950s class reunion at a Chicago public high school.  The graduation year, prominently displayed on the stage, was the same year that Marv and I had graduated from our own public high schools!  As we watched, our mouths agape, we soon figured out that the story focused on the “greasers” at the high school one of its writers attended.

The parallel with our own lives was undeniable.  No, we hadn’t attended schools where “greasers” dominated, but I clearly recalled the students my friends and I jokingly called “hoods”—short for “hoodlums.”  These kids were not terribly different from the working-class teenagers in “Grease.”  My school was dominated by middle-class kids, not the “hoods,” but we were all keenly aware of each other.

It turned out that the musical was first produced in Chicago in 1971, when Marv and I were living in California and totally unaware of local theater in Chicago.  It finally landed in NYC in 1972, about a year before we saw it, and it became the enduring hit we all know. Even better known: The 1978 film version that became a worldwide sensation.  “Grease” went on to earn both Broadway and movie fandom.

The music in the Broadway show we saw that night was astounding:  It borrowed the sounds of early rock-and-roll hits that Marv and I knew and loved.  It’s not surprising that many of the songs in “Grease” remain popular today. 

When the curtain finally came down, the three of us looked at each other.  We had all shared that era in the ‘50s just portrayed on the stage.  I was in a state of shock, trying to recover from the profound experience of reliving a slice of life from our high school days. 

You know what?  I don’t think I’ve ever completely recovered.

Declare your independence: Those high heels are killers

HAPPY JULY!  Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes.  My newly revised post for 2024 follows:


I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body.

Several years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.” 

During the past few years, largely dominated by the global pandemic, many women and men adopted different ways to clothe themselves.  Sweatpants and other comfortable clothing became popular, and many women abandoned wearing high heels.  Staying close to home in comfortable clothes, they saw no need to push their feet into high heels, and venues requiring professional or otherwise fancy footwear had all but disappeared.  But when the pandemic began to loosen its grip, some women were tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing was noted in publications like The New York Times and The Wall Street Journal.  According to the Times, some were seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay. I get it.  “Dressing up” may be your thing after a few years of relying on sweatpants.  But “towering heels”?  They may look beautiful.

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store. 

But during one wearing of those heels, the pain became so great that I removed them and walked in stocking feet the rest of the way home.  After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.  Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention.  Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a troubling trend toward higher and higher heels.  I was baffled by women, especially young women, who bought into the mindset that they had to follow the dictates of fashion and look “sexy” by wearing extremely high heels.  Watching TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  Women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word) wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over.  Sadly, this phenomenon has reappeared. 

Otherwise enlightened women are once again appearing on TV wearing absurdly high heels.  Even one of my favorite TV journalists, Stephanie Ruhle, has appeared on her “11th Hour” program on MSNBC in stilettos.  C’mon, Steph!  Don’t chip away at my respect for you.  Dump those stilettos!

What about the women, like me, who adopted lower-heeled shoes instead of following fashion?  I think we’re much smarter and much less likely to fall on our faces.  One very smart woman who’s still a fashion icon agreed with us long ago: the late Hollywood film star Audrey Hepburn.  Audrey dressed smartly, in both senses of the word.  I recently watched her 1963 smash film Charade for the tenth or twelfth time and once again noted how elegant she appeared in her Givenchy wardrobe and her–yes–low heels.  Audrey was well known for wearing comfortable low heels in her private life as well as in her films.  In Charade, paired with Cary Grant, another ultra-classy human being, she’s seen running up and down countless stairs in Paris Metro stations, chased not only on those stairs but also through the streets of Paris.  She couldn’t possibly have done all that frantic running in high heels!

Foot-care professionals have soundly supported my view.  According to the American Podiatric Medical Association, a heel higher than 2 or 3 inches makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.  A noted podiatrist and foot and ankle surgeon has explained that above 1.5 inches, pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He advised against wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel. 

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles.  Although stilettos hadn’t disappeared from the scene, they weren’t the only choices.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  Then the pandemic arrived and changed shoe-purchasing.  During its first year, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.  But now that the pandemic has largely dissipated, some women may have resurrected the high heels already in their closets.  They may even be inspired to buy new ones.  I hope they don’t.

There is heartening news from bellwether Nordstrom.  In a recent catalog, two pages featured nothing but sneakers.  Other pages displayed nothing but flat-heeled shoes and “modern loafers.”  Stilettos were nowhere to be seen.

Let’s not forget Gen Z.  Most Gen Z shoppers don’t follow the dictates of fashion.  They largely eschew high heels, choosing pricey and often glamorous sneakers instead–even with dressy prom dresses.

My own current faves: I wear black Skechers almost everywhere (I own more than one pair).  I occasionally choose my old standby, Reeboks, for serious walking.  (In my novel Red Diana, protagonist Karen Clark laces on her Reeboks for a lengthy jaunt, just as I do.)  I recently bought a pair of Ryka sneakers–so far so good.  And in warm weather, I wear walking sandals, like those sold by Clarks, Teva, and Ecco.

Any woman who is pondering buying high-heeled shoes should hesitate.  Beyond the issues of comfort and damage to your feet, please remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels may be putting their lives at risk.  When they need to flee a dangerous situation, high heels can handicap their ability to escape. How many needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this
year, I once again urge the women of America to declare their independence from high-heeled shoes. If
you’re thinking about returning to painful footwear, think again.  I urge you to bravely gather any high heels you’ve been clinging to and throw those shoes away.  At the very least, keep them out of sight in the back
of your closet.  And don’t even think about buying new ones.  Shod
yourself instead in shoes that allow you to walk in comfort—and if need
be, to run. Your wretched appendages, yearning to be free, will be forever grateful.

Does anyone still iron?

I like to stay abreast of the news.  To get current info, I’ve purchased online subscriptions to newspapers like The Washington Post.

The Post is currently in a state of upheaval, reeling from revelations about editors and others who run the paper.  But it’s still churning out plenty of news stories.  I like to peruse the list of stories every day, choosing the ones I want to read and deleting the rest.

One of the things I don’t look for in the Post is advice on how to do household chores. I’ve never been much of a fanatic about housekeeping.  Probably a predictable reaction to my mother’s
obsession with it.  I do keep things clean, and I can usually track down what I need.  My sister, on the other hand, happily followed my mother’s path.  When she listed her house for sale, the listing described her place as “impeccably maintained.”  I couldn’t help joking that my house was, by contrast, “peccably maintained.”  (I don’t think that’s a real word.)

Noting the chaos going on behind the scenes at the Post, I was recently astounded to come across advice on “how to iron better and faster.”  A long column set out “ironing tips and tricks.”

My reaction?  I don’t iron!  I haven’t ironed anything in years.  Does anyone still iron?

After moving to San Francisco 19 years ago, leaving behind all sorts of things I used in my former home, I purchased a new steam iron at Macy’s.  My apartment has a built-in ironing board, and I guess I expected to use this iron someday. But I never have.  It languishes in its box, resting on a shelf, eagerly waiting to confront some wrinkled clothes.  The funny thing is that I never wear clothes that need ironing.

I can’t help remembering how my mother was addicted to an old-fashioned heavy non-steam iron she must have acquired in the 1950s.  When I finally made a little money in my newly launched legal career, I gave her the gift of a brand-new steam iron purchased at the old Sears department store on State Street in downtown Chicago.  Incredibly, she forced me to return it!  She clearly preferred her long-established habit of wielding that heavy iron and watching all manner of clothes turn wrinkle-free under her hand.

I confess that in my long-ago past, I did iron a few items of clothing.  I specifically recall ironing
the white cotton blouses we all wore back then. But I happily left cotton blouses behind years ago.  When I worked as a lawyer, I sometimes wore silk blouses that needed special care, but I sent
those to the cleaners rather than tackle them myself.

Today my wardrobe is filled with very little besides t-shirts and jeans and black pants that don’t need ironing.  I can’t imagine standing in front of an ironing board handling a dangerously hot appliance that does nothing more than remove wrinkles. I view that as a tremendous waste of my time.

If you choose to wear clothes that need ironing, I certainly respect your choice.  You must prefer to wear
clothes very different from mine.  Maybe you’d like to read the advice by Post columnist Helen Carefoot.  In her column, she covers topics like “how to use the ironing board correctly.”  Good luck to you in that pursuit! 

Please forgive me if I don’t join you.  I’m quite fulfilled wearing my t-shirts and non-ironed pants while I take a hike outdoors, watch the latest must-see on TV, or sit in front of my desktop computer, happily typing away.

Hollywood’s take on unwanted pregnancies

The current turmoil over abortion rights arose after the U.S. Supreme Court reversed Roe v. Wade two years ago.  But the problems created by unwanted pregnancies have been around for generations, since long before Roe v. Wade made legal abortions possible in the U.S. in 1973. 

A Place in the Sun was a powerful 1951 Hollywood film highlighting the problem. Starring Montgomery Clift and Elizabeth Taylor, the film featured Shelley Winters as a hapless young woman whose unwanted pregnancy led to disastrous consequences. Based on the Theodore Dreiser novel An American Tragedy, the film dramatized a real-life story dating back to 1906.  I’ve watched this film many times, and although I felt sympathy for Shelley Winters’s pathetic character, I never related to her.

A much later Hollywood film openly dealt with the subject of abortion in 1963.  Love With the Proper Stranger featured two Hollywood superstars of that era, Natalie Wood and Steve McQueen (both of whom coincidentally met untimely deaths in 1980/81).  It became a huge box-office hit, and it’s worth revisiting today.

In the film, Natalie Wood (as “Angela”) and Steve McQueen (as “Rocky”) confront the abortion question head-on.  Rocky is a jazz musician seeking a gig at a union hiring hall in NYC when Angela suddenly appears.  Steve McQueen had just starred in The Great Escape and a bunch of popular Western films, but he reportedly wanted to play a different kind of character in a different kind of film.  Natalie Wood’s career was thriving, and she probably relished playing a sharp young woman who boldly chooses to confront the one-night stand who’s caused her a serious problem—an unwanted pregnancy.

Angela’s life is constrained by her oppressive family. She’s “choking to death” in their small apartment, constantly vowing to escape. Now, unhappily pregnant, she tells Rocky, “All I want from you is a doctor.”

After some hesitation, Rocky tracks down the name of a doctor who charges $400 for an abortion, and he agrees to pay half.  The two of them arrive at the location where they’ve been told to bring the money, but the lowlife they meet demands another $50.  (Please note: Angela is wearing a dress and high-heeled shoes, an outfit that looks absurd when viewed today. This is what Hollywood moguls must have thought women wore to their illegal abortions in the 1960s.)

The couple has scraped up the original $400 fee with difficulty, so they resort to getting the extra $50 from Rocky’s family. They finally make their way to the doctor’s address, a run-down apartment where Angela shakily begins to undress.  But the abortionist is not an MD, just a rude woman with scary-looking things in a suitcase.  Angela is shocked and begins to sob, fearful of what might happen to her.  Rocky bursts in, and they escape together, Rocky bravely announcing, “I’ll kill them before I let them touch you.”

Their budding romance has its ups and downs as they deal with Angela’s family and a prospective suitor her mother pushes on her.  But Rocky finally realizes that he loves Angela, and he asks her to marry him.  Thus we have a typical Hollywood “happy ending.”  Except that this couple has shared a horrific run-in with the illegal abortion industry that existed in NYC in 1963. 

Love With the Proper Stranger offered a cautionary tale for its audience, including a young woman like me.  When I saw this movie, I was a naïve student hovering between college and law school. Although I was dating a variety of suitors, I wasn’t as sexually active as many other women my age. (I was what we called a “good girl”.)  Still, I could easily see myself in Angela’s appalling situation, confronting an unwanted pregnancy sometime in the future.  And it certainly struck me as unfair that it was the woman who had to deal with this situation while her partner could escape without any consequences.

Four years later, I graduated from law school with the goal of helping minorities and women achieve the justice often denied them in the U.S. at that time.  So when I began work in a job that enabled me to challenge the constitutionality of the restrictive Illinois abortion statute, I seized the opportunity to effect change and, with my co-counsel, took on that challenge.

Did seeing the film, Love With the Proper Stranger, influence me in any way?  Specifically, did it influence me to become a lawyer who challenged that restrictive law? 

Maybe.

In retrospect, I think I was influenced by a great many things in our culture.  Including Hollywood movies.

A much more recent movie similarly addresses the deplorable absence of abortion rights in 1963:  the 2021 French film Happening, based on the 2000 novel of the same title by Annie Ernaux, winner of the 2022 Nobel Prize in Literature.  Ernaux’s novel, and the film adapted from it, dramatize the real-life experiences she endured (coincidentally in 1963) when, as a promising young college student, she was faced with an unwanted pregnancy.  Both the film and the book depict her repeated attempts to secure a safe abortion, thwarted by the harsh anti-abortion law governing French women at that time.

Happening is a far more sophisticated version of this story than the 1963 U.S. film. It garnered outstanding reviews from prominent film critics worldwide. Anyone viewing it lives through exactly what women at every level of French society confronted when they tried to live meaningful lives free from the cruel and antiquated views on abortion held by the French government’s leadership.

Like the story in Love With the Proper Stranger, it’s a story as vivid to us today as it was to those of us who fought against the harsh laws depriving women of their reproductive freedom in the past.  In 2024, we must vow to re-fight those fights whenever and wherever our reproductive rights are denied.

“A Raisin in the Sun”

The enduring acclaim for the play “A Raisin in the Sun,” as well as its film version, has inspired me to relate what happened when I saw the play for the very first time. 

In 1959, this stunning new play about a Black family in Chicago, written by the exciting young playwright Lorraine Hansberry, premiered at an upscale downtown Chicago theater, the Blackstone Theatre.  Although histories of the play often state that it premiered on Broadway in New York City, it actually appeared earlier in Chicago.

The sometimes-caustic theater critic for the Chicago Tribune, Claudia Cassidy, wrote an enthusiastic review of it on February 11, 1959, noting that it was “a remarkable new play” that was “still in tryout.”

“Raisin” represented an enormous theatrical leap because of its plot–a realistic portrayal of a Black family in Chicago confronted with a crucial decision–and because of the brilliant performances by its actors, including Sidney Poitier and Ruby Dee.

I was lucky to see “Raisin” during its pre-Broadway stay in Chicago.  As a Chicago public high-school student with limited funds, I saw it as an usher.

Ushering was a fairly casual affair in those days.  Often accompanied by a friend or two, I would simply show up at a theater about an hour before the curtain went up and ask the usher-captain whether she could use another usher.  The answer was invariably “yes,” and I would be assigned to a designated area in the theater where I would check tickets and seat ticket-holders. Ushering enabled me to see a great many plays and musicals at no cost whatsoever, and I ushered as often as my school’s schedule allowed.

I’ve never forgotten the startling incident that occurred during the matinee performance of “Raisin” I viewed as an usher.  In the midst of the performance, for no apparent reason, the actors suddenly stopped speaking.  The reason became clear when the theater manager strode onto the stage.  Bottling his rage, he explained that the actors had been struck by items thrown at the stage by patrons in the theater. 

I was shocked to learn of this extremely disrespectful behavior.  I’d never witnessed a problem of any kind created by audience members.

I concluded (fairly, in my opinion) that the audience must have included a number of boorish high-school students sitting in the balcony that afternoon thanks to “comp” tickets.  Some of them were undoubtedly displaying the bigoted attitude toward Black people that prevailed in their homes.

The Chicago area’s population at that time included large numbers of white people who were biased against Blacks.  Some of these whites felt threatened by any possibility of change in their communities.  Some later openly demonstrated against Dr. Martin Luther King Jr.’s visit to Chicago.

Here, in an upscale downtown theater, was the ugly and ignorant result of this bias.

Has anything changed since 1959?  For a long time, I thought it had.  During my years as a public interest lawyer and, later, as a law school professor and writer, I worked toward and believed in meaningful progress in the area of civil rights.  I had hoped that this feeling by some white people that they were threatened by Blacks–and eventually by Browns as well—had decreased.

Sadly, our recent history has revealed that this feeling still exists. It’s even been encouraged by certain “leaders” in the political arena.  Some predict that violence could be the ultimate outcome.

I worry that we’re edging toward a return to the ethos of 1959 and the hostility displayed during the performance of “A Raisin in the Sun” I saw back then.  I fervently hope that this will not happen, and that most Americans will vehemently reject the prospect that it could.