Category Archives: The New York Times

Marlon, Tony, and Cyd

Thanks to the cable TV channel Turner Classic Movies (TCM), I frequently watch a wide range of movies, from those produced in the late ’30s to those made in the 21st century.

Some of my favorites are movies from the 1950s.  One highlight is the 1955 film Summertime, featuring Katharine Hepburn as a single woman who finds love while touring Venice on her own. Shot on location in Venice, it’s not your typical romantic movie, surpassing that genre with Hepburn’s brilliant performance and its glorious setting.

Among many other films from the ‘50s, I recently came across the 1955 Hollywood version of the 1950 Broadway blockbuster musical Guys and Dolls.  I’d seen it before but not for decades, and the TCM introduction by host Ben Mankiewicz was intriguing.  He noted that the film’s director, Joe Mankiewicz (Ben’s uncle), induced Marlon Brando to take the role of the leading man (Sky Masterson) despite Brando’s reluctance to assume a role in a musical. 

Joe reportedly told Marlon that he’d never directed a musical before, but, hey, they’d worked well together one year earlier when Joe directed the film version of Julius Caesar, and neither of them had ever done Shakespeare in a film before. As we know, Julius Caesar was a success, and Joe convinced Marlon that they’d also succeed together in a musical.

Although I enthusiastically agree that they both performed at the top of their game in Julius Caesar, their later collaboration in a musical was less than totally successful.

Filled with catchy tunes composed by the great Frank Loesser, the movie is exuberant, probably as exuberant as a movie musical can be.  But one enormous weakness is Marlon’s lack of vocal ability.  His part requires that he sing a host of major songs, but his voice just isn’t up to them.

(By the way, Frank Sinatra was reportedly angling for this role and not happy about being given the secondary part of Nathan Detroit.)

One of the most obvious examples of Marlon’s poor vocal ability is his rendition of “Luck Be a Lady,” a show-stopping musical number on Broadway. 

When I watched Marlon’s pitiful attempt to master it, I was flooded with memories of first hearing this song performed—live—by singer Tony Martin at the Flamingo Hotel in Las Vegas.

I was a kid when my family and I arrived in Las Vegas en route from Chicago to Los Angeles.  We’d left our life in Chicago behind, hoping to find a new one for all of us in LA.  Our move was prompted by my father’s serious illness, which we optimistically believed had been cured, and by his hope to establish our family in sunny LA.

I was delighted by our departure.  I knew I’d miss my friends in Chicago, who memorably gave me a surprise farewell party featuring a cake emblazoned with “California, Here Comes Sue” (my preferred nickname at the time).  But I was excited about forging a new life on the West Coast, where I fervently hoped that Daddy would be healthy and able to build a new career.  Sadly, that wasn’t to be.  (I plan to write about that period in my life another time.)

Many of you may be wondering, “Who was Tony Martin?”

Although Tony Martin has faded into our cultural background today, he was a prominent American singer and film actor during most of the 20th century.  Born in San Francisco and raised in Oakland, Tony began his musical career with a local orchestra before leaving for Hollywood in the mid-’30s.  He appeared on radio programs like Burns & Allen, then moved on to films, where he starred in a number of musicals and received equal billing with the Marx Brothers in their final film, The Big Store.  After serving in WWII, he returned to the U.S., recorded memorable songs for Mercury and RCA (including some million-sellers), and went back to Hollywood to star in film musicals in the ’40s and ’50s.  He also began performing in Las Vegas and other venues and continued to perform live till he was over 90.  (The NY Times reported that he performed at Feinstein’s on Park Avenue in NYC at the age of 95.)

Until his death at 98 in 2012, Tony was truly a fixture: for seven decades he appeared in Hollywood films, on records, and on TV, and headlined live concert performances.  In the public mind, he’s been eclipsed by another Tony, Tony Bennett, who became successful during the ’50s recording hits like “Because of You” and “Rags to Riches.”  Bennett’s rendition of 1962’s “I Left My Heart in San Francisco” became his signature song and made him a hero in San Francisco (although it was Tony Martin who was actually born in SF).  Tony Bennett, perpetuating his role as a celebrated singer of pop standards, jazz, and show tunes, has become something of a cultural touchstone.  Despite his recent battle with Alzheimer’s, his popularity endures.  I can’t deny that his prominent place in the American musical landscape has lasted far longer than Tony Martin’s.

Back to my story…. 

Our family was staying at an inexpensive motel on the Las Vegas Strip, but Daddy had grand plans for us.  He succeeded in getting us front-row tickets for Tony Martin’s memorable performance at the Flamingo, a luxury hotel on the Strip.

The Flamingo Hotel itself is noteworthy.  As the 1991 film “Bugsy” (starring Warren Beatty as Bugsy Siegel) and, more recently, the 2021 film “Lansky” (featuring Harvey Keitel as Meyer Lansky) make clear, Ben “Bugsy” Siegel and Meyer Lansky were major figures in organized crime who funded the construction of the Flamingo Hotel in the late forties.  It was finally completed in 1947 around the time Bugsy was shot to death by his fellow mobsters, who believed him guilty of skimming money. 

I knew nothing of this history until many years later.  When I was a kid, all I knew was that I got to see and hear Tony Martin live at the Flamingo.  I absolutely reveled in being part of the audience that night, watching Tony perform.

When Tony sang “Luck Be a Lady,” he lighted up the stage, and the audience responded enthusiastically. I recall being completely enthralled. 

Marlon’s performance in Guys and Dolls wasn’t in the same league.

At the same time that Tony was performing this song far better than Marlon ever could, Tony’s wife, dancer Cyd Charisse, was making her own mark in Hollywood.  Tony and Cyd married in 1948, and their six-decade marriage ended only with Cyd’s death in 2008.

Cyd was an astounding dancer in a raft of Hollywood films, paired with both Gene Kelly (in Brigadoon, for one) and Fred Astaire.  Her dance number with Astaire in The Band Wagon (to the song “Dancing in the Dark”) has been immortalized in 1994’s That’s Entertainment III.  And if you watch 1957’s Silk Stockings (a musical version of Garbo’s Ninotchka), your eyes are riveted on her fantastic dancing, which outdoes Astaire’s in every way.  (By the way, Cyd’s comments in her autobiography on dancing with Kelly and Astaire are fascinating.)

Was Cyd in the audience that night, sharing her husband’s fabulous performance with the rest of us?  I’ll never know.  But it’s exciting to imagine that she was there, applauding with gusto, just as we did, to pay tribute to Tony’s outstanding rendition of “Luck Be a Lady.”

It goes without saying that Marlon Brando was a brilliant actor, one of the most remarkable actors of his generation.  His performances in films like On the Waterfront, A Streetcar Named Desire, The Godfather, and, for that matter, Julius Caesar, will remain in our cultural memory as long as films endure. 

But notably, after playing Sky Masterson in Guys and Dolls, Marlon never attempted another singing role.  

Declare Your Independence: Those high heels are killers

Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes. 

I’ve revised this post in light of changes that have taken place during the past year and a couple of new ideas I want to pass along.

My newly revised post follows:

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body. 

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.” 

During the past two years, largely dominated by the global pandemic, many women and men adopted different ways to clothe themselves.  Sweatpants and other comfortable clothing became popular.  [Please see my post, “Two Words,” published July 15, 2020, focusing on pants with elastic waists.]

In particular, many women abandoned the wearing of high heels.  Staying close to home, wearing comfortable clothes, they saw no need to push their feet into high heels.  Occasions requiring professional clothes or footwear almost disappeared, and few women sought out events calling for any sort of fancy attire.

But as the pandemic began to loosen its grip, some women were tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing was noted in publications like The New York Times and The Wall Street Journal.   In a story in the Times, one woman “flicked the dust off her…high-heeled lavender pumps” that she’d put away for months and got ready to wear them to a birthday gathering.  According to the Times, some are seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay.  I get it.  “Dressing up” may be your thing after a couple of years relying on sweatpants.  But “towering heels”?  They may look beautiful, they may be alluring….

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention. 

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a trend toward higher and higher heels, and I found it troubling.  I was baffled by women, especially young women, who bought into the mindset that they had to follow the dictates of fashion and look “sexy” by wearing extremely high heels.

When I’d watch TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  I couldn’t help noticing the women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word), yet wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over. 

Sadly, this phenomenon has reappeared. On late-night TV, otherwise enlightened women are once again wearing absurdly high heels.

So…what about the women, like me, who adopted lower-heeled shoes instead?  I think we’ve been much smarter and much less likely to fall on our faces. One very smart woman who’s still a fashion icon: the late Hollywood film star Audrey Hepburn. Audrey dressed smartly, in both senses of the word.

I recently watched her 1963 smash film Charade for the eighth or tenth time. I especially noted how elegant she appeared in her Givenchy wardrobe and, yes, her low heels. Audrey was well known for wearing comfortable low heels in her private life as well as in her films. [Please see my blog post: https://susanjustwrites.com/2013/08/08/audrey-hepburn-and-me/….]

In Charade, paired with Cary Grant, another ultra-classy human being, she’s seen running up and down countless stairs in Paris Metro stations, chased not only on those stairs but also through the streets of Paris. She couldn’t possibly have done all that frantic running in high heels!

Foot-care professionals have soundly supported my view.  According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.

A few years ago, the San Francisco Chronicle questioned a podiatrist and foot and ankle surgeon who practiced in Palo Alto (and assisted Nike’s running team).  He explained that after 1.5 inches, the pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He did not endorse wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel.  If you insist on wearing higher heels, you should limit how much time you spend in them.

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles along with higher-heeled numbers.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  At the same time, I wondered whether Nordstrom was reflecting what its shoppers had already told the company’s decision-makers.  The almighty power of the purse (how shoppers were choosing to spend their money) probably played a big role.

The pandemic may have changed the dynamics of shoe-purchasing, at least at first. During the pandemic’s first year, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.  Today, the pandemic may be a somewhat less frightening presence in our lives, and there are undoubtedly women who will decide to resurrect the high heels already in their closets.  They, and others, may be inspired to buy new ones.

I hope these women don’t act in haste.  Beyond the issue of comfort, let’s remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When they need to flee a dangerous situation, high heels can handicap their ability to escape.

How many needless deaths have resulted from hobbled feet?

Gen Z shoppers can provide a clue to the future. They largely eschew high heels, choosing glamorous sneakers instead, even with dressy prom dresses.

My own current faves: I wear black Skechers almost everywhere. I occasionally choose my old standby, Reeboks, for serious walking. [In my novel Red Diana, protagonist Karen Clark laces on her Reeboks for a lengthy jaunt, just as I do.] And when warm temperatures dominate, I wear walking sandals, like those sold by Clarks, Teva, and Ecco.

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I once again urge the women of America to declare their independence from high-heeled shoes. 

If you’re currently thinking about returning to painful footwear, please think again.

I encourage you to bravely gather any high heels you’ve clung to during the pandemic and throw those shoes away.  At the very least, keep them out of sight in the back of your closet.  And don’t even think about buying new ones.  Shod yourself instead in shoes that allow you to walk in comfort—and if need be, to run.

Your wretched appendages, yearning to be free, will be forever grateful.

[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]

The wage gap is still enormous

You’re probably wondering.  Wage gap?  Huh?

This isn’t a sexy topic, but it’s troubling, and it’s not the first time I’ve written about it.  There are certainly more compelling topics to discuss right now (e.g., gun safety, the persistence of Covid), but I want to focus on this today.

Five years ago, in July 2017, I noted my concern with the CEO-worker wage gap [https://susanjustwrites.com/2017/07/31/random-thoughts-ii/].

What was bothering me?  The CEO “pay ratio” stood at 271-to-1.

I was looking at the Economic Policy Institute’s annual report on executive compensation, released on July 20, 2017.  According to that report, chief executives of America’s 350 largest companies made an average of $15.6 million in 2016, or 271 times more than what the typical worker made that year.

Yes, the number was slightly lower than in 2015, when the average CEO pay was $16.3 million and the ratio was 286-to-1.  And it was even lower than the highest ratio ever calculated, 376-to-1 in 2000.

But, as I pointed out, before we popped any champagne corks because of the slightly lower number, we had to remember that in 1989, after eight years of Ronald Reagan in the White House, the ratio was 59-to-1, and in 1965, in the midst of the Vietnam War and civil rights turmoil, it was 20-to-1.
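To make those ratios more concrete, here’s a rough back-of-the-envelope sketch, a hypothetical illustration of my own that uses only the EPI figures quoted above (it’s not taken from the EPI report itself), showing what the ratios imply about typical worker pay:

```python
# Illustrative arithmetic only: derive the implied typical-worker pay
# from the average CEO pay and CEO-to-worker ratios cited above (EPI figures).
epi_figures = {
    2016: {"avg_ceo_pay": 15_600_000, "ratio": 271},
    2015: {"avg_ceo_pay": 16_300_000, "ratio": 286},
}

for year, figures in epi_figures.items():
    implied_worker_pay = figures["avg_ceo_pay"] / figures["ratio"]
    print(f"{year}: implied typical worker pay ≈ ${implied_worker_pay:,.0f}")

# Output:
# 2016: implied typical worker pay ≈ $57,565
# 2015: implied typical worker pay ≈ $56,993
```

In other words, by these figures, an average CEO at one of these companies earned more in a single week than the typical worker did in an entire year.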

In 2017, I wanted us to reflect on those numbers.  To think about how distorted these ratios were and what they said about our country.  I asked, “Did somebody say ‘income inequality’?”

Why am I writing about this issue again?  Because this week Andrew Ross Sorkin reported in The New York Times that the average pay gap between low-wage workers and the CEOs of their companies is still enormous.

Sorkin reported that, according to a brand-new study by the Institute for Policy Studies, median pay for workers at companies that tend to pay low wages was up by 17 percent, thanks in part to inflation.  But that increase was dwarfed by CEO pay, which rose by 30 percent at those same companies.  Sarah Anderson, the lead author of the study, “Executive Excess,” noted, “this could have been a time when companies used rising profits to level the playing field.  Instead, we haven’t seen a very big shift in pay equity.”

Further, CEOs did even better at companies where workers’ wages didn’t keep pace with inflation.  At about a third of the firms in the study, median wages fell behind inflation, and at those companies average CEO pay was up by 65 percent, more than double the increase across all of the firms in the study.

One company in this group was Best Buy, where median pay fell two percent last year (to $29,999), while the CEO, Corie Barry, got a 30 percent pay increase to $15.6 million.  Barry may have done a bang-up job, but the huge difference in pay is pretty stark.
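Here’s a similarly rough, hypothetical calculation of my own (not from Sorkin’s column or the study), using only the two Best Buy figures quoted above, to show just how stark that gap is:

```python
# Illustrative arithmetic only, based on the Best Buy figures quoted above.
ceo_pay = 15_600_000        # Corie Barry's reported pay
median_worker_pay = 29_999  # Best Buy's reported median worker pay

ratio = ceo_pay / median_worker_pay
print(f"Implied CEO-to-median-worker pay ratio: about {ratio:,.0f}-to-1")
# Output: Implied CEO-to-median-worker pay ratio: about 520-to-1
```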

Hey, Best Buy, I just bought some stuff from you.  If I’d known that my purchases had contributed to this vast inequity in pay, I’d have thought twice about giving you my business.  I don’t like to think that such a big chunk of your profits, including those derived from customers like me, went straight to your CEO instead of to your workers.

Is there any possibility for change?  There may be a glimmer of hope.  Sorkin’s report also noted that the SEC (Securities and Exchange Commission) could possibly move in that direction.  According to Sorkin, a group of former regulators (including two former SEC commissioners) have asked the SEC to issue new rules illuminating this disparity. 

The petition for this rule-change contends that “investors need more information about what companies pay workers,” and it urges the SEC to propose new rules requiring companies to disclose how much they’re investing in their workforces.

The two former SEC commissioners (Joseph Grundfest and Robert Jackson) have, in the past, often had opposing views.  They noted, “We differ in our views about the regulation of firms’ relationships with their employees generally.”  But, they added, “we all share the view that investors need additional information.”  The group stated that the current accounting and tax rules make “investing in machines more attractive than spending on humans.”

Right now only about 15 percent of public companies disclose their labor costs. The proposed rules would require that companies disclose their labor costs (and no longer lump them in with other expenses).  They’d also require companies to provide detailed workforce compensation data, including information on the breakdown for contract, part-time, and full-time employees.

So we may finally be able to see the current disparity in compensation clearly.  If the SEC endorses these new rules, we could get much more transparency in workers’ compensation, because data showing who earns how much would be available for everyone, including investors, to see.

At least some investors could then make choices that would benefit workers’ compensation.

The goal is achieving greater equity.  I think that many if not most investors would welcome a move in that direction.  As Virginia’s Senator Mark Warner, who supports the petition, says, “No one can credibly argue that this type of disclosure wouldn’t be valuable or material to investors in a highly competitive, 21st-century, global economy.”

Declare Your Independence: Those high heels are killers

Following a tradition I began several years ago, I’m once again encouraging women to declare their independence this July 4th and abandon wearing high-heeled shoes. 

I’ve revised this post in light of changes that have taken place during the past year.

My newly revised post follows:

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body. 

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.” 

During the past year, one dominated by the global pandemic, many women and men adopted different ways to clothe themselves.  Sweatpants and other comfortable clothing became popular.  [Please see my post, “Two Words,” published July 15, 2020, focusing on wearing pants with elastic waists.]

In particular, many women abandoned the wearing of high heels.  Staying close to home, wearing comfortable clothes, they saw no need to push their feet into high heels.  Occasions requiring professional clothes or footwear almost disappeared, and few women sought out events calling for any sort of fancy attire.

As the pandemic has loosened its grip, at least in many parts of the country, some women have been tempted to return to their previous choice of footwear.  The prospect of a renaissance in high-heeled shoe-wearing has been noted in publications like The New York Times and The Wall Street Journal.   In a recent story in the Times, one woman “flicked the dust off her…high-heeled lavender pumps” that she’d put away for months and got ready to wear them to a birthday gathering.  According to the Times, some are seeking “the joy of dressing up…itching…to step up their style game in towering heels.”

Okay.  I get it.  “Dressing up” may be your thing after more than a year of relying on sweatpants.  But “towering heels”?  They may look beautiful, they may be alluring….

BUT don’t do it!  Please take my advice and don’t return to wearing the kind of shoes that will hobble you once again.

Like the unfortunate young women in Riverside, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I nevertheless managed to secure ample male attention. 

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters often suffer in spades.

Until the pandemic changed our lives, I observed a trend toward higher and higher heels, and I found it troubling.  I was baffled by women, especially young women, who bought into the mindset that they had to follow the dictates of fashion and look “sexy” by wearing extremely high heels.

When I’d watch TV, I’d see too many women wearing stilettos that forced them into the ungainly walk I briefly sported so long ago.  I couldn’t help noticing the women on late-night TV shows who were otherwise smartly attired and often very smart (in the other sense of the word), yet wore ridiculously high heels that forced them to greet their hosts with that same ungainly walk.  Some appeared to be almost on the verge of toppling over. 

On one of the last in-person Oscars telecasts (before they became virtual), women tottered to the stage in ultra-high heels, often accompanied by escorts who kindly held onto them to prevent an embarrassing descent into the orchestra pit.

So…what about the women, like me, who adopted lower-heeled shoes instead?  I think we’ve been much smarter and much less likely to fall on our faces.

Foot-care professionals have soundly supported my view.  According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.

A couple of years ago, the San Francisco Chronicle questioned Dr. Amol Saxena, a podiatrist and foot and ankle surgeon who practiced in Palo Alto (and assisted Nike’s running team).  He explained that after 1.5 inches, the pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He did not endorse wearing 3-inch heels and pointed out that celebrities wear them for only a short time, not all day.  To ensure a truly comfortable shoe, he added, no one should go above a 1.5-inch heel.  If you insist on wearing higher heels, you should limit how much time you spend in them.

Before the pandemic, some encouraging changes were afoot.  Nordstrom, one of America’s major shoe-sellers, began to promote lower-heeled styles along with higher-heeled numbers.  I was encouraged because Nordstrom is a bellwether in the fashion world, and its choices can influence shoe-seekers.  At the same time, I wondered whether Nordstrom was reflecting what its shoppers had already told the company’s decision-makers.  The almighty power of the purse (how shoppers were choosing to spend their money) probably played a big role.

But the pandemic may have completely changed the dynamics of shoe-purchasing.  Once we faced its reality, and it stuck around for months, sales of high heels languished, “teetering on the edge of extinction,” according to the Times.

Today, with the pandemic a somewhat less frightening presence in our lives, there are undoubtedly women who will decide to resurrect the high heels already in their closets.  They, and others, may be inspired to buy new ones, dramatically changing the statistics—and their well-being.

I hope these women don’t act in haste.  Beyond the issue of comfort, let’s remember that high heels present a far more serious problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When they need to flee a dangerous situation, high heels can handicap their ability to escape.

How many needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I once again urge the women of America to declare their independence from high-heeled shoes. 

If you’re currently thinking about returning to painful footwear, please think again.

I encourage you to bravely gather any high heels you’ve clung to during the pandemic and throw those shoes away.  At the very least, please keep them out of sight in the back of your closet.  And don’t even think about buying new ones.  Shod yourself instead in shoes that allow you to walk in comfort—and if need be, to run.

Your wretched appendages, yearning to be free, will be forever grateful.

[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]

RBG in ’72

Countless words have been, and will continue to be, written about the incomparable U.S. Supreme Court Justice Ruth Bader Ginsburg, who served on the high court for 27 years.

I will leave discussions of her tenure on the Court to others.

What I will do here is recount the one and only time I encountered her in person, at a law school conference, at a pivotal point in her career.  If you’re interested in learning about that encounter, please read on.

In September of 1972, I was a full-time faculty member at the University of Michigan (UM) Law School.  Notably, I was the only woman on the full-time faculty.

The law school had a desirable setting on the UM campus, whose multitude of elm trees had unfortunately been denuded of leaves, thanks to Dutch elm disease. The law school occupied the stunning Law Quadrangle, featuring beautiful old buildings constructed in the English Gothic style.

My role on the faculty was to help first-year law students learn the basics of legal education:  how to analyze court rulings (the kind they would read in the books assigned to them in courses like Torts and Contracts); how to do their own research into case law; and how to write a readable legal document, like an appellate brief aimed at persuading an appellate court to decide in their favor.

I was one of four young lawyers hired to fill this role.  The three men and I each taught one-fourth of the first-year class.  As I recall, we got to choose our offices in the law school library, and I immediately chose a plum.  It was an enormous wood-paneled room with charming hand-blown stained glass windows.  One entered it via a stairway leading upstairs from the library’s impressive reading room.  I treasured my office and happily welcomed meeting with students there.  And I wonder, in light of renovations at the law school, whether that glorious office still exists.

At some point early that fall, I learned that a conference on “women and the law” would be held at the New York University School of Law in October.  This was a bold new area of law that most law schools didn’t consider worth their attention.  NYU was clearly an exception. 

The idea of the conference immediately grabbed my attention because of my longstanding interest in its stated focus.  One reason I had attended law school myself a few years earlier was that, from very early in my life, I had been (and remain) concerned with achieving equity and justice, including equal rights for women.

This focus had led me to attend law school during the mid-’60s.  My first job was that of law clerk to a U.S. district judge in Chicago.  After finishing my clerkship, I became a practicing lawyer as a Reggie assigned to my first choice, the Appellate and Test Case Division of the Chicago Legal Aid Bureau.  [I discussed the Reggie program in a blog post, “The Summer of ’69,” published on August 7, 2015.]

And so, three years earlier, in October of 1969, I had begun working on a lawsuit that had a significant bearing on women’s rights because it would challenge the constitutionality of Illinois’s restrictive abortion law. This law had an enormous impact on the lives of women, especially poor and non-white women.

I worked with Sybille Fritzsche, a lawyer with the ACLU in Chicago, who became my close friend.  Sybille and I spent months preparing our case.  We filed our lawsuit in February 1970, argued it before a three-judge federal court in September, and won a 2-to-1 ruling in our favor in January 1971.  (The ruling in that case, Doe v. Scott, and the events leading up to it, are the focus of a book I’m currently writing.  In the meantime, you can read about our case in historian Leslie Reagan’s prize-winning book, When Abortion Was a Crime.)

Now, in the fall of 1972, I learned about the conference at NYU.  Because I was extremely interested in attending, I decided to ask the UM law school’s dean, Theodore St. Antoine, whether the school might send me to New York for it.  I thought I had a pretty persuasive argument:  I was the only full-time woman on the law school faculty.  Didn’t the dean think it would be a good idea to send me to represent UM at the conference?

How could he say “no”?  Ted thought about it for a moment, then gave his approval.  So off I went, my expenses paid by the kind patrons of UM.

My hotel, the Fifth Avenue Hotel, located near NYU’s law school, had sounded appealing on paper, but it turned out to be something of a dump.  It suited me just fine, however, because I barely spent any time there.  I was too busy attending the conference sessions and, when I could, taking a short break to reconnect with a couple of law-school classmates and briefly sample life in New York City, a city light-years removed from less-than-exhilarating Ann Arbor, Michigan.

The conference, held on October 20-21, turned out to be a symposium sponsored by the AALS (the Association of American Law Schools), “The AALS Symposium on the Law School Curriculum and the Legal Rights of Women.”  It featured a number of prominent speakers, mostly law professors and practicing lawyers who had turned their attention to “the legal rights of women” in areas like tax law, property law, and criminal law.  I attended most of these sessions, and each of them was excellent.

But the only session I was really excited about was a talk by someone named Ruth Bader Ginsburg.  I was quite certain that I would relish hearing her talk, “Toward Elimination of Sex-Based Discrimination: Constitutional Aspects,” because the topic was right down my alley.

Looking back, I don’t think I knew anything about RBG at the time.  But when she was introduced (by NYU dean Robert McKay) and began to speak, I was riveted by every word she uttered.  She spelled out everything she had already done, and planned to do, to achieve gender equity.

So although I hadn’t been familiar with her before, I knew immediately that she was, and would continue to be, a brilliant leader in the field of women’s rights.  I filed her name away in my memory so I could follow whatever she did in the coming years.  And I did just that, enthusiastically following the many astounding accomplishments she achieved after 1972.

Your image of RBG may be that of the frail, petite woman who took center stage in our culture in her 80s.  But the RBG I saw in 1972 was very different.  She was an amazingly attractive young woman of 39.  You can see photos of her at that time in The New York Times of September 18 (in Linda Greenhouse’s long review of her life and career) and in a recent issue of TIME magazine (Oct. 5-12, 2020). Although much has been made of her short stature (one I share), she was so very energetic and focused that one quickly forgot how small she was.

It turned out that she had attended Harvard Law School about a decade before I did.  Like her, I’ve been called a “trailblazer” and a “pioneer,” and I, too, confronted gender bias at every turn throughout my life.  My path was only a bit less rocky than hers:  My class at HLS included a whopping 25 women in a class of 520, while hers had only 9.

I’ve since learned that October 1972 marked a pivotal time in RBG’s career.  She had just switched her teaching position from Rutgers Law School to Columbia Law School (a considerable upgrade).  And she had just assumed another new position:  Director of the Women’s Rights Project at the ACLU, a project she had helped to found a short time before. 

So I’m left wondering…did she know about the case Sybille (an ACLU attorney in Chicago) and I brought in February 1970, a case that put a woman’s right to reproductive choice front and center?

RBG was an ardent supporter of reproductive rights during her tenure on the Supreme Court.  She discussed her views on abortion and gender equality in a 2009 New York Times interview, where she said “[t]he basic thing is that the government has no business making that choice for a woman.”

But I know that she had also said she wasn’t entirely happy that Roe v. Wade secured that choice for every woman in the U.S. through federal-court litigation in cases like Doe v. Scott.  She stated that she would have preferred that the argument be made, over time, in each state’s legislature, with the right to choose being gradually adopted in that way rather than in one overriding court ruling that covered every state.

Notably, on the 40th anniversary of the court’s ruling in Roe v. Wade, she criticized the decision because it terminated “a nascent democratic movement to liberalize abortion laws” that might have built “a more durable consensus” in support of abortion rights.

She had a point.  A democratic movement to liberalize abortion laws would have been the ideal, and perhaps less contentious, route to achieving abortion rights throughout the country.

But I think her position was influenced by her own life story. 

It stemmed, at least in part, from the fact that in April 1970, she was living and working in New York, where the state legislature had passed a new law allowing abortion, and New York Governor Nelson Rockefeller had signed it on April 11, 1970.  New York became only the second state in the U.S. (after Hawaii) to permit abortion, and only a few other states had carved out any sort of exception to what was otherwise a nationwide ban on abortion.

RBG may have optimistically believed that other states would follow New York’s lead.  But history has proved otherwise.

If women had waited for each of the 50 states to accomplish the goal of women’s reproductive choice, I think we’d still have many states refusing to enact laws allowing choice.  In support of my view, I ask readers to consider the situation today, when some states are so frenetically trying to restrict abortion, with or without achieving a complete ban, that they’re now simply waiting for a far-right conservative Court to overturn Roe v. Wade.

Whether or not RBG was aware of what was happening in the courtrooms of Chicago in 1970, I think I could have persuaded her that Sybille and I were doing the right thing.  

By advocating that the federal district court hold that the restrictive Illinois abortion law was unconstitutional, and persuading the court to decide in our favor, we achieved our goal of saving the lives and health of countless women who would have otherwise suffered from their inability to obtain a legal and medically safe abortion.

What greater achievement on behalf of women’s rights could there have been? 

I like to think that, after hearing my argument, RBG would have approved.

Is It Time to Resurrect the “Housedress”?

The HBO miniseries “The Plot Against America,” which appeared earlier this year, focused on life in America in the early 1940s.  Adapted from Philip Roth’s 2005 novel, the series told a terrifying story, highlighting the possibility that a fascist, anti-Semitic regime could assume control over politics in our country.

New York Times critic A.O. Scott, describing HBO’s adaptation as “mostly faithful” to the novel, observed that the world it portrayed looked familiar, yet different, to us today.  He noted in particular “the clothes” worn by the people inhabiting that world, as well as the cars, the cigarettes, and what he called “the household arrangements,” evoking a period “encrusted with…nostalgia.”

The series was, in my view, a stunning depiction of that era, along with a chilling prediction of what might have happened.  Thankfully, Roth’s fictional prediction never came true, and I hope it never will.

One thing I took away from the series was how authentically it re-created the images of that time.  I was born years later than both Philip Roth and his character, the 8-year-old Philip.  But I can recall images from the 1950s, and I’ve seen countless films dating from the 1940s and 1950s, as well as TV shows like “I Love Lucy.”

A couple of things in the series stand out.  First, people got their news from newspapers and the radio.  The leading characters appear in a number of scenes reading the daily newspapers that influenced their view of the world.  They also listened attentively to the radio for news and other information.  The radio broadcaster Walter Winchell even plays an important part in the story.

The other thing that stands out is the clothing worn by the characters in “Plot.”  Especially the women characters.  These women tended to have two types of wardrobes.  One represented the clothing they wore at home, where they generally focused on housecleaning, cooking, and tending to their children.  The other represented what they would wear when they left home, entering the outside world for a variety of reasons.

The wardrobe worn at home looked extremely familiar.  My mother clung to that wardrobe for decades.  She, like the women in “Plot,” wore housedresses at home.  These were cotton dresses, usually in a floral or other subdued print, that were either buttoned or wrapped around the body in some fashion.  In an era before pants became acceptable for women (Katharine Hepburn being a notable exception), women wore dresses or skirts, even to do housework at home.

Only when they left home, to go somewhere like an office or a bank, did they garb themselves in other clothes.  In this wardrobe, they tended to wear stylish dresses made with non-cotton fabrics, or skirt suits with blouses, along with hats and white gloves. Working women employed in office-type settings (there were a few, like the character brilliantly played by Winona Ryder in “Plot”) wore these clothes to work every day. (Women employed in other settings of course wore clothes appropriate to their workplaces.)

Now, with most of us staying home for the most part, I wonder:  Is it time to resurrect the housedress?

Here are some reasons why it might be:

  1. Warmer weather is approaching, or may have already arrived, depending on where you live.
  2. Heavy clothing like sweatshirts and sweatpants, which many of us have relied on during our self-isolation at home, will become impractical because it will be uncomfortably hot.
  3. Pajamas and nightgowns aren’t a good idea for all-day wear.  We should save them for bedtime, when we need to separate our daytime experience from the need to get some sleep.
  4. The housedress offers an inviting choice for women who want to stay comfortably at home, wearing cool cotton (or cotton-blend) dresses that allow them to move as comfortably as they do in sweat clothes, all day long.

I concede that comfortable shorts and t-shirts might fit the bill, for men as well as women.  But I suggest that women consider an alternative.  They may want to give housedresses a try.

Ideally, a woman will be able to choose from a wide range of cheerful fabric designs and colors.  If she can track down one that appeals to her, she just might be convinced by its comfort and then tempted to wear more of them.

I’ve already adopted my own version of the housedress.  I rummaged through one of my closets and found a few items I haven’t worn in years.  I’ve always called them “robes,” although they’ve also been called housecoats or other names.  My mother for some reason liked to call them “dusters.”  My husband’s aunt liked to wear what she called “snap coats.”

But in the big picture, we’re really talking about the same thing.  Cotton robes/dresses in a variety of designs and prints. Today they’re usually fastened with snaps.  Easy in, easy out.

And most of them have pockets!  (As I’ve written before, all women’s clothes should have pockets.)  [Please see my blog post “Pockets!” https://susanjustwrites.wordpress.com/2018/01/ ]

I plucked a couple of these out of my closet, some with the brand name Models Coats.  I had never even worn one of them.  (A tag was still attached, featuring the silly slogan, “If it’s not Models Coat…it’s not!”)  But I’ll wear it now.

By the way, I’ve checked “Models Coats” on the internet, and an amazing variety of “housedresses,” or whatever you choose to call them—Models Coats and other brands–is offered online.  So it appears that some women have been purchasing them all along.

Now here’s a bit of cultural history:  My mother kept her 1950s-style housedresses well into the 1990s.  I know that because I discovered them in her closet when we visited her Chicago apartment one cold winter day in the ‘90s.  Mom lived in a 1920s-era apartment building, filled with radiators that ensured overheated air in her apartment.  [Please see my blog post “Coal:  A Personal History,” discussing the overheated air that coal-based radiators chugged out:  https://susanjustwrites.wordpress.com/2020/01/29/coal-a-personal-history/ ]

My daughters and I had worn clothing appropriate for a cold winter day in Chicago.  But as we sat in Mom’s overheated living room, we began to peel off our sweaters and other warm duds.  (My husband didn’t do any peeling.  He was too smart to have dressed as warmly as we had.)

It finally occurred to me that Mom might have saved her housedresses from long ago.  Maybe she even continued to wear them.  So I searched her closet and found three of them.  My daughters and I promptly changed, and we immediately felt much better.  But when we caught sight of ourselves, we laughed ourselves silly.  We looked a lot like the model in a Wendy’s TV commercial we called “Russian fashion show.”

In our favorite Wendy’s commercial, dating from 1990, Russian music plays in the background while a hefty woman dressed in a military uniform announces the fashion show in a heavy Russian accent.  The “model” comes down the runway wearing “day wear,” “evening wear,” and “beachwear.”  What’s hilariously funny is that she wears the same drab dress, along with a matching babushka, in each setting.  For “evening wear,” the only change is that she waves a flashlight around.  And for “beachwear,” she’s clutching a beach ball.

Wendy’s used clever commercials like this one to promote their slogan:  “Having no choice is no fun,” clearly implying that Wendy’s offered choices its fast-food competitors didn’t.  I don’t know whether these commercials helped Wendy’s bottom line, but they certainly afforded our family many, many laughs.

[If you need some laughs right now, you can find these commercials on YouTube.  Just enter words like “Wendy’s TV commercials” and “Russian fashion show.”]

Mom’s housedresses weren’t as drab as the dress worn by the model in our favorite commercial.  They tended to feature brightly colored prints.  Admittedly, they weren’t examples of trend-setting fashion.  But they certainly were cool and comfortable.

In our current crisis, we need to be creative and come up with new solutions to new problems.  For those women seeking something comfortable to wear, something different from what they’ve been wearing, colorful housedresses just might be the right choice.

Hooray for Hollywood! Part I

As a lifelong film buff (OK, since I was about 4), I have great fondness for much that Hollywood (and foreign cinema) has produced.  Each year I try to see a number of new films and re-watch some of the old ones.

During the past year, I never got around to seeing most of the blockbusters that dominated the box office. According to the online publication The Verge, Disney produced an unprecedented 80 percent of the top box-office hits in 2019.

Thanks to its purchases of Marvel Entertainment (2009) and Lucasfilm (2012) during the last decade, Disney’s output has included franchises like Star Wars and the Marvel hits, in addition to popular animated films like Frozen and Frozen 2.  The result:  Disney films have surpassed many other films at the box office.

But I don’t pay a lot of attention to box-office success.  I’m far more focused on seeing films that have something to say to me. This year my clear favorite was Once Upon a Time…in Hollywood.

Once Upon a Time, a Quentin Tarantino film, is not only a fabulous depiction of Hollywood in 1969; it also relates to me and my life in a number of ways.

Spoiler alert:  If you haven’t yet seen this film, DO NOT read the ending of this blog post, where I write about the Manson murders.

First, about the film itself:  It’s been called a “buddy picture,” and in many ways it is.  In two stellar performances, Leonardo DiCaprio (playing the fictional Rick Dalton) and Brad Pitt (playing the fictional Cliff Booth) are indeed buddies.  Rick is a fading former star of a Western TV series, trying to make a comeback in Hollywood, while Cliff is his longtime stunt double.  By 1969, with Rick’s star on the wane, Cliff spends much of his time driving Rick from place to place.  Both are struggling to survive in a Hollywood that has changed from the one they knew.

Weaving fiction and fact throughout the film, Tarantino uses both humor and violence to depict the end of an era.  In this love letter to 1960s Hollywood (which has earned positive reviews from most top critics on Rotten Tomatoes and garnered numerous awards and nominations), he embeds specifics of popular culture and real places in 1969 LA into the film.

The story takes place during two days in February and one day in August of 1969.  Notably, Rick Dalton’s home is right next door to the home of minor film star Sharon Tate (married to director Roman Polanski) in a posh section of western LA, Benedict Canyon.

In this film, Tarantino also skillfully blends in the ugly story of the Charles Manson “family.”

Re-creating in many ways the world that I lived in at about the same time, even if he himself did not, Tarantino provoked a cascade of intensely vivid memories for me.  Here’s why:

I left Chicago in August 1970 and moved to the Westwood neighborhood on the west side of LA, where I rented a cheerful furnished apartment within walking distance of UCLA.

I had moved my “Reggie Fellowship” from the Appellate and Test Case Division of the Chicago Legal Aid Bureau to a health-law related Legal Services office located at UCLA Law School.  Reggies were predominantly young lawyers who opted to work on behalf of the poor rather than toil in a corporate law firm.  (Please see my more detailed description of the Reggie program in an earlier post, “The Summer of ’69,” published on August 7, 2015.)

Westwood and Westwood Village (the commercial area in Westwood, adjacent to UCLA) loom large in my memory.  I met my husband-to-be (I’ll call him Marv) on the UCLA campus in October 1970, six weeks after I arrived.  Before we met, we had both rented separate apartments in the same apartment building on the fringe of the campus. We soon began dating, and my memory bank is filled with countless memories related to our courtship and marriage that year.

My new location was very close to much of what happens in the Tarantino film, set only one year earlier.  So when he replicates details from that time, I recall having seen and heard many of them myself.

Examples:  Street signs, ads painted on bus-stop benches, movie posters, commercials, and music. (Some of these are Tarantino’s own inventions.)

Probably the best example:  Sharon Tate goes to see herself in a film at a movie theater in Westwood Village.  During the year that I lived in Westwood, I saw many films at the movie theaters in Westwood Village.  (Seeing “Love Story” with Marv in one of them in December 1970 was especially memorable, and I plan to write about it in a future blog post.)

Another example:  A scene in the movie is set at the famous LA restaurant called Musso & Frank Grill.  Marv and I were both aware of its fame, and during that year we sought it out and dined there one special night.

One more thing:  The stunning area where Sharon Tate and Roman Polanski lived next door to the fictional Rick Dalton (Benedict Canyon) is in western LA, not far from Westwood and very close to Bel-Air.  Marv and I not only lived in Westwood, but we also celebrated our wedding luncheon at the charming Bel-Air Hotel.

Then there’s the Manson family storyline in the movie.  I learned about the Manson murders during a weekend in New York City.  I was spending part of the summer of 1969 at the Reggie training program at Haverford College, near Philadelphia, and I traveled from Philly to NYC one weekend in August.

During trips to NYC, I often stayed with a close friend and law-school classmate (I’ll call her Arlene).  Although Arlene was planning to be out of town that weekend, she invited me to stay in her 86th Street apartment on the East Side of Manhattan without her.  It was a great opportunity to live by myself as a quasi-New Yorker, and I decided to do it.

Returning to her apartment on Saturday evening, I picked up the Sunday New York Times and was shocked by a headline spelling out the startling discovery of the Manson murders.

At that time, I was still living in Chicago, but I had briefly lived in LA when I was 12 and always liked to follow any news arising there.  So I was riveted by the Manson story and read the paper from cover to cover.

When Tarantino decided to weave this story into the rest of his film, he did what he’d done in Inglourious Basterds and changed the real ending to a much different one.

Watching Once Upon a Time, I was terribly nervous as the film approached its ending.  I knew how the real story turned out, and I didn’t know exactly how this film would portray it.  But what a departure from reality Tarantino created!  The shocking ending to the film includes imaginative violence that is so over-the-top that it’s almost humorous.  Overall, the ending is a clever re-imagining of the fate of the Manson family and a much happier resolution of what happened to their victims.

Although the new ending was violent in its own way, creating an exciting piece of filmmaking, I left the theater in a much sunnier frame of mind than I would have if Tarantino had re-created the actual massacre that took place in 1969.


In sum, Once Upon a Time is, to my mind, an absorbing and fascinating film.  For me, it was one of the best films of 2019.


I plan to write again about Hollywood films that have been relevant to my own life.  Part II will begin to explore classic films that have done just that.


My Life as a Shopper

I have a new outlook on shopping.  I’m no longer shopping the way I used to.

Why?

I’ll start at the beginning.  My long history of shopping began when I was very young.

My parents were both immersed in retailing.  My mother’s parents immigrated to Chicago from Eastern Europe and, soon after arriving, opened a clothing store on Milwaukee Avenue.  Their enterprise evolved into a modest chain of women’s apparel stores, and throughout her life my mother was intimately involved in the business.  She instilled in me the belief that shopping for new things, especially clothes, was a good thing.  Under her influence, I gave away countless wearable items of clothing in favor of getting something new, preferably something sold in one of her family’s stores.  (I later regretted parting with some of the perfectly good items I could have continued to wear for many more years.)

My father earned a degree in pharmacy from the University of Illinois and enjoyed some aspects of his work as a pharmacist, but he, too, was drawn to retailing.  At a young age, he opened his own drugstore on the South Side of Chicago (I treasure a black-and-white photo of him standing in front of his store’s window).  After marrying my mother, he spent a number of years working in her family’s business, and in the late ‘40s the two of them opened a women’s clothing boutique on Rush Street, a short distance from Oak Street, in a soon-to-be-trendy shopping area.  The boutique was ahead of its time and quickly folded, but Daddy never lost his taste for retailing.

In view of this history, I was fated to become a “shopper.”  After Daddy died when I was 12, our family wasn’t able to spend big wads of money on anything, including clothes.  But my mother’s inclination to buy new clothes never really ceased.

Thanks to generous scholarship and fellowship awards, I made my way through college and grad school on a minuscule budget.  I spent almost nothing, savoring the 99-cent dinner at Harkness Commons almost every night during law school.  And because I began my legal career with a $6,000 annual salary as a federal judge’s law clerk and, as a lawyer, never pursued a high-paying job (I preferred to work on behalf of the poor, for example), I got by without big-time shopping.

Marriage brought little change at first.  My darling new husband also came from a modest background and was not a big spender, even when our salaries began to move up a bit.

But things eventually changed.  Higher salaries and the arrival of new retail chain stores featuring bargain prices made buying stuff much more tempting.  I needed presentable clothes for my new full-time jobs.  Our daughters needed to be garbed in clothes like those the other kids wore.  Our living room chairs from Sears began to look shabby, propelling us toward somewhat better home décor.

A raft of other changes led me to spend more time shopping.  My boring law-firm jobs were more tolerable if I could escape during my lunch hour and browse at nearby stores.  The rise of outlet malls made bargain shopping easier than ever.  And travels to new cities and countries inspired buying small, easily packable items, like books and jewelry.

After I moved to San Francisco, having jettisoned possessions I’d lived with for years in my former home, I needed to acquire new ones.  So there I was, buying furniture and kitchen equipment for my sunny new apartment.

At the same time, our consumption-driven culture kept pushing us to buy more and more, including the newly emerging “fast fashion” that offered stylish clothes at temptingly low prices.

But this emphasis on acquiring new stuff, even low-priced stuff, has finally lost its appeal.

I’ve come to realize that I don’t need it.

My overall goal is to simplify my life.  This means giving away a lot of things I don’t need, like stacks of books I’ll never read and charming bric-a-brac that’s sitting on a shelf collecting dust.  Like clothes that a disadvantaged person needs more than I do.

My new focus:  First, use what I already have.  Next, do not buy anything new unless I absolutely need it.

Choosing not to acquire new clothes (in essence, reusing what I already have and adopting the slogan “shop your closet”) is a perfect example of my new outlook.

I’ve previously written about confining one’s new purchases to “reunion-worthy” clothes.  [Please see my blog post of October 12, 2017, advising readers to choose their purchases carefully, making sure that any clothes they buy are flattering enough to wear at a school reunion.]

But that doesn’t go far enough.  New purchases should be necessary.

I find that I’m not alone in adopting this approach.

Many millennials have eschewed buying consumer goods, opting for new experiences instead of new material things.  I guess I agree with them on this.

Here’s more evidence of this approach.  A July 2019 article in The Guardian proclaimed:  “’Don’t feed the monster!’ The people who have stopped buying new clothes.”  Writer Paula Cocozza described the growing number of people who love clothes but resist buying new ones because of their lack of sustainability:  many of the consumers she interviewed were switching to second-hand shopping so they would not perpetuate this cycle of consumption and waste.

Second-hand shopping has even taken off online.  In September, the San Francisco Chronicle noted the “wave of new resale apps and marketplaces” adding to longtime resale giants like eBay.  At the same time, The New York Times, covering Fashion Week in Milan, wrote that there was “a lot of talk about sustainability over the last two weeks of collections, and about fashion’s role in the climate crisis.”  The Times added:  “the idea of creating clothes that last—that people want to buy and actually keep, keep wearing and never throw out, recycle or resell”—had become an important part of that subject.  It quoted Miuccia Prada, doyenne of the high-end clothing firm Prada:  “we need to do less.  There is too much fashion, too much clothes, too much of everything.”

Enter Tatiana Schlossberg and her new book, Inconspicuous Consumption:  The Environmental Impact You Don’t Know You Have (2019).  In the middle of an absorbing chapter titled “Fashion,” she notes that “There’s something appealing about being able to buy really cheap, fashionable clothing […] but it has given us a false sense of inexpensiveness.  It’s not only that the clothes are cheap; it’s that no one is paying for the long-term costs of the waste we create just from buying as much as we can afford….”

Some scholars have focused specifically on this issue, the “overabundance of fast fashion—readily available, inexpensively made new clothing,” because it has created “an environmental and social justice crisis.”  Christine Ekenga, an assistant professor at Washington University in St. Louis, has co-authored a paper on the “global environmental injustice of fast fashion,” asserting that the fast-fashion supply chain has created a dilemma:  while consumers can buy more clothes for less, those who work in or live near textile-manufacturing facilities bear a disproportionate burden of environmental health hazards.  Further, millions of tons of textile waste sit in landfills and other settings, hurting the low-income countries that produce many of these clothes.  In the U.S., about 85 percent of the clothing Americans consume (nearly 80 pounds per person per year) is sent to landfills as solid waste.  [See “The Global Environmental Injustice of Fast Fashion” in the journal Environmental Health.]

A high-profile public figure had an epiphany along the same lines that should influence all of us.  The late Doug Tompkins was one of the founders of The North Face and later moved on to help establish the apparel chain Esprit.  At the height of Esprit’s success, he sold his stake in the company for about $150 million and moved to Chile, where he embraced a whole new outlook on life and adopted an important new emphasis on ecology.  He bought up properties for conservation purposes, in this way “paying my rent for living on the planet.”  Most tellingly, he said, “I left that world of making stuff that nobody really needed because I realized that all of this needless overconsumption is one of the driving forces of the [environmental] crisis, the mother of all crises.”  [Sierra magazine, September/October 2019.]

Author Marie Kondo fits in here.  She has earned fame as a de-cluttering expert, helping people who feel overwhelmed with too much stuff to tidy up their homes.  Her focus is on reducing clutter that’s already there, so she doesn’t zero in on new purchases.  But I applaud her overall outlook.  As part of de-cluttering, she advises:  As you consider keeping or letting go of an item, hold it in your hands and ask:  “Does this item bring me joy?”  This concept of ensuring that an item brings you joy could apply to new purchases as well, so long as the item bringing you joy is also one you really need.

What should those of us enmeshed in our consumer culture do?  In The Wall Street Journal in July 2019, April Lane Benson, a “shopping-addiction-focused psychologist and the author of ‘To Buy or Not to Buy:  Why We Overshop and How to Stop’,” suggested that if a consumer is contemplating a purchase, she should ask herself six simple questions:  “Why am I here? How do I feel? Do I need this? What if I wait? How will I pay for it? Where will I put it?”

Benson’s list of questions is a good one.  Answering them could go a long way toward helping someone avoid making a compulsive purchase.  But let’s remember:  Benson is talking about a shopper already in a store, considering whether to buy something she’s already selected in her search for something new.  How many shoppers will interrupt a shopping trip like that to answer Benson’s questions?

I suggest a much more ambitious scheme:  Simply resolve not to buy anything you don’t need!

My 11-year-old granddaughter has the right idea:  She’s a minimalist who has rejected any number of gifts from me, including some fetching new clothes, telling me she doesn’t need them.

When I reflect on my life as a shopper, I now understand why and how I became the shopper I did.  Perhaps, in light of my family history and the increasingly consumption-driven culture I’ve lived through, I didn’t really have an option.

But I have regrets:  I’ve wasted countless hours browsing in stores, looking through racks and poring over shelves for things to buy, many of which I didn’t need, then spending additional hours returning some of the things I had just purchased.

These are hours I could have spent far more wisely:  pursuing my creative work, exercising more often and more vigorously, doing more to help those in need.

Readers:  Please don’t make the mistakes I have.  Adopt my new philosophy.  You’ll have many more hours in your life to pursue far more rewarding goals than acquiring consumer goods you don’t really need.


Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These nay-sayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke no more than a 3 out of 5 on a happiness rating, the result was very different.  After receiving the thank-you notes, the recipients reported how happy they were to get them:  many said they were “ecstatic,” scoring 4 out of 5 on the happiness scale.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other positive benefits:  gratitude enhances empathy and reduces aggression (a 2012 study by the University of Kentucky), it improves sleep (a 2011 study in Applied Psychology: Health and Well-Being), and it improves self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives that are filled with what might be called “blessings” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world can join us in our much more gratitude-worthy place on this planet.


Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good brown fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight gain (at least for those of us who are “super-smellers”), and whether high-fat foods like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?