Category Archives: San Francisco Chronicle

Thanksgiving 2021

Thanksgiving 2021 has come and gone.  But let’s reflect on it for a moment.

As we celebrated the holiday this year, our country was facing a number of serious problems:  climate change, political divisions, the continuing coronavirus pandemic.  But we’ve had reason to be thankful for some positive changes as well.

Among the positive changes we can point to is the long-overdue recognition of the rights of indigenous peoples, like those who were at the “first Thanksgiving.”  Contrary to the traditional and untrue telling of that event—a story that’s still perpetuated in at least some of the schools our children attend—the people who were already here (commonly called American Indians or Native Americans) did not view the Pilgrims’ celebratory feast as a happy one.

Even then, at the very beginning of our country’s history, the Indian people who were confronted with Europeans arriving on their shores viewed them not as welcome guests but as a threat. 

If that was indeed the judgment of their leaders, they were right.  The new settlers were oppressors who drove the native peoples off their land—in the words of U.S. Secretary of the Interior Deb Haaland, these “ancestors…who stewarded our lands since time immemorial.”

Secretary Haaland, the first Native American appointed to a major cabinet post by a U.S. President and a former member of the U.S. Congress, spoke at a ceremony on November 19th, marking the 52nd anniversary of the occupation of Alcatraz Island by indigenous people in 1969.  During her remarks, she announced that she had established a process to review and replace derogatory names currently attached to our nation’s geography.

Specifically, Secretary Haaland ordered the federal board tasked with naming geographic places, the Board on Geographic Names, to remove the term “squaw” from federal usage.  The Board, established in 1890, has in the past identified derogatory terms on a case-by-case basis, but more extensive replacements have also occurred.  In 1962, Secretary Stewart Udall identified the N-word as derogatory and directed the Board to eliminate its use.  In 1974, the Board similarly identified a pejorative term for “Japanese” as derogatory and eliminated its use.

Most Americans may be unaware that “squaw” is a derogatory term, used for many years to demean women, especially Native women.  But Haaland was outspoken in condemning it.  She said, “Racist terms have no place in our vernacular or on our federal lands.  Our nation’s lands and waters should be places to celebrate the outdoors and our shared cultural heritage—not to perpetuate the legacies of oppression.”

Several states have already passed legislation prohibiting the use of this term in place names, including Montana, Oregon, Maine, Oklahoma, South Dakota, and Minnesota.  Legislation is currently pending in both chambers of Congress to address derogatory names on public land.

The new order to eliminate this woman-demeaning term presents a significant problem in California.  The San Francisco Chronicle reported on November 24th that an estimated 100-plus places in California carry the derogatory name.  These include peaks, streams, trails, and other geographic features.  According to the ACLU, there may be as many as 113 sites in California using this term.  Looming large are two small towns in Northern California called Squaw Valley, one in North Lake Tahoe, the other in Fresno County.

The Chronicle reported a statement by Roman Rain Tree, a member of a band of Native tribes indigenous to the Fresno County area, who has been organizing a grassroots effort to rename the rural town of Squaw Valley.  Secretary Haaland, he said, has made “a giant leap forward.  It restores my belief that the government has elected officials who will look after our community.”

The Chronicle also reported that the California State Parks have identified a number of geographic features carrying the name and intend to rename them, moving us “closer to the goal of reckoning with our past, making space for healing and promoting equity.”  Removing the term is seen as a priority.

More troublesome is renaming the towns called Squaw Valley.  According to the Chronicle, thousands of people have already signed an online petition to change the name of the town in Fresno County.  But some residents of the community have “balked at the idea, contending that ‘squaw’ isn’t universally offensive.”  A county supervisor said that “Squaw Valley is offensive to some, but not all.  … [T]he local community needs to be involved in that conversation.”

Meanwhile, the Tahoe ski resort, long named Squaw Valley, has already changed its name to Palisades Tahoe.  Now it apparently needs to do a better job of publicizing its new name.  A short time ago, I heard an ABC weather reporter still refer to it on national television as “Squaw Valley.”

The San Francisco Examiner also reviewed some of these issues on November 25th, writing about a ceremony to be held at Alcatraz Island on what most of us viewed as Thanksgiving Day but others viewed as “a day of mourning for Indigenous people, also known as ‘Unthanksgiving Day.’”  This ceremony first took place in 1975, six years after indigenous activists occupied the island to claim it as a place promised to them in a treaty that was later broken by the federal government.  April McGill, executive director of the American Indian Cultural Center, told the Examiner that she hoped “people think about what the holiday really means and rethink it…[not] to do away with the holiday altogether but to remove the celebration of Thanksgiving, instead [to think of it as showing] gratitude for the fall harvest.”

At the same time, California is just beginning to reckon with its long and ugly history regarding the treatment of American Indians.  An essay by John Briscoe, published in the Chronicle on November 28th, outlines this history, noting that while California was admitted to the union in 1850 as a “free state,” it was, in truth, “conceived in genocide” of its Native Americans.  A long-established principle of law required the U.S. to honor the private property rights of indigenous peoples.  Instead, the state of California openly sponsored the “theft” of land belonging to the local tribes that lived here.  Indians were also subject to the state’s Indian Slavery Act (enacted despite being in violation of the state’s constitution) until it was repealed in 1937.

Serranus Hastings, California’s first chief justice, profited off the enslavement of Indians, and the law school in San Francisco that bears his name is now in the process of renaming itself.  Briscoe writes that Hastings, Leland Stanford, and many others acquired vast tracts of land through violence against Indians and made fortunes in real estate as a result.  “California Indians had rights guaranteed by law—American domestic law and international law—[including] the right not to be murdered, not to be enslaved, not to be stripped at gun and knife point of their ancestral lands.”  But, he says, each of these rights “was systematically and repeatedly violated by the state of California.”

In 2019, there was belated acknowledgment of these wrongs.  Governor Gavin Newsom officially apologized “on behalf of the citizens of the state of California to all California Native Americans for the many instances of violence, maltreatment and neglect California inflicted on tribes.”  Newsom also created a Truth and Healing Council to clarify the historical record.

Although we should never forget past inequities, which have occurred throughout our country and its long history, we should also acknowledge the positive changes that have taken place in recent years.  With Native American Deb Haaland as our new Secretary of the Interior, the U.S. may finally be moving towards equity for our indigenous peoples.

I, for one, am happy to know that some of these changes have happened in time for Thanksgiving 2021.

My Life as a Shopper

I have a new outlook on shopping.  I’m no longer shopping the way I used to.

Why?

I’ll start at the beginning.  My long history of shopping began when I was very young.

My parents were both immersed in retailing.  My mother’s parents immigrated to Chicago from Eastern Europe and, soon after arriving, opened a clothing store on Milwaukee Avenue.  Their enterprise evolved into a modest chain of women’s apparel stores, and throughout her life my mother was intimately involved in the business.  She instilled in me the belief that shopping for new things, especially clothes, was a good thing.  Under her influence, I gave away countless wearable items of clothing in favor of getting something new, preferably something sold in one of her family’s stores.  (I later regretted parting with some of the perfectly good items I could have continued to wear for many more years.)

Although my father received a degree in pharmacy from the University of Illinois and enjoyed some aspects of his work as a pharmacist, he too was attracted to retailing.  At a young age, he opened his own drugstore on the South Side of Chicago (I treasure a black-and-white photo of him standing in front of his store’s window).  After marrying my mother, he spent a number of years working in her family’s business, and in the late ‘40s the two of them opened a women’s clothing boutique on Rush Street, a short distance from Oak Street, in a soon-to-be-trendy shopping area.  Ahead of its time, the boutique quickly folded, but Daddy never lost his taste for retailing.

In view of this history, I was fated to become a “shopper.”  After Daddy died when I was 12, our family wasn’t able to spend big wads of money on anything, including clothes.  But my mother’s inclination to buy new clothes never really ceased.

Thanks to generous scholarship and fellowship awards, I made my way through college and grad school on a minuscule budget.  I saved money by spending almost nothing, savoring the 99-cent dinner at Harkness Commons almost every night during law school.  And because I began my legal career with a $6,000 annual salary as a federal judge’s law clerk and, as a lawyer, never pursued a high-paying job (I preferred to work on behalf of the poor, for example), I got by without big-time shopping.

Marriage brought little change at first.  My darling new husband also came from a modest background and was not a big spender, even when our salaries began to move up a bit.

But things eventually changed.  Higher salaries and the arrival of new retail chain stores featuring bargain prices made buying stuff much more tempting.  I needed presentable clothes for my new full-time jobs.  Our daughters needed to be garbed in clothes like those the other kids wore.  Our living room chairs from Sears began to look shabby, propelling us toward somewhat better home décor.

A raft of other changes led me to spend more time shopping.  My boring law-firm jobs were more tolerable if I could escape during my lunch hour and browse at nearby stores.  The rise of outlet malls made bargain shopping easier than ever.  And travels to new cities and countries inspired buying small, easily packable items, like books and jewelry.

After I moved to San Francisco, having jettisoned possessions I’d lived with for years in my former home, I needed to acquire new ones.  So there I was, buying furniture and kitchen equipment for my sunny new apartment.

At the same time, our consumption-driven culture kept pushing us to buy more and more, including the “fast fashion” that emerged, offering stylish clothes at temptingly low prices.

But this emphasis on acquiring new stuff, even low-priced stuff, has finally lost its appeal.

I’ve come to realize that I don’t need it.

My overall goal is to simplify my life.  This means giving away a lot of things I don’t need, like stacks of books I’ll never read and charming bric-a-brac that’s sitting on a shelf collecting dust.  Like clothes that a disadvantaged person needs more than I do.

My new focus:  First, use what I already have.  Next, do not buy anything new unless I absolutely need it.

Choosing not to acquire new clothes—in essence, reusing what I already have, adopting the slogan “shop your closet”—is a perfect example of my new outlook.

I’ve previously written about confining one’s new purchases to “reunion-worthy” clothes.  [Please see my blog post of October 12, 2017, advising readers to choose their purchases carefully, making sure that any clothes they buy are flattering enough to wear at a school reunion.]

But that doesn’t go far enough.  New purchases should be necessary.

I find that I’m not alone in adopting this approach.

Many millennials have eschewed buying consumer goods, opting for new experiences instead of new material things.  I guess I agree with the millennials’ outlook on this subject.

Here’s other evidence of this approach.  An article in The Guardian in July 2019 shouted “’Don’t feed the monster!’ The people who have stopped buying new clothes.”  Writer Paula Cocozza noted the growing number of people who love clothes but resist buying new ones because of their lack of sustainability:  Many consumers she interviewed were switching to second-hand shopping so they would not perpetuate this cycle of consumption and waste.

Second-hand shopping has even taken off online.  In September, the San Francisco Chronicle noted the “wave of new resale apps and marketplaces” adding to longtime resale giants like eBay.  At the same time, The New York Times, covering Fashion Week in Milan, wrote that there was “a lot of talk about sustainability over the last two weeks of collections, and about fashion’s role in the climate crisis.”  The Times added:  “the idea of creating clothes that last—that people want to buy and actually keep, keep wearing and never throw out, recycle or resell”—had become an important part of that subject.  It quoted Miuccia Prada, doyenne of the high-end clothing firm Prada:  “we need to do less.  There is too much fashion, too much clothes, too much of everything.”

Enter Tatiana Schlossberg and her new book, Inconspicuous Consumption: The Environmental Impact You Don’t Know You Have (2019).  In the middle of an absorbing chapter titled “Fashion,” she notes that “There’s something appealing about being able to buy really cheap, fashionable clothing […] but it has given us a false sense of inexpensiveness.  It’s not only that the clothes are cheap; it’s that no one is paying for the long-term costs of the waste we create just from buying as much as we can afford….”

Some scholars have focused specifically on the “overabundance of fast fashion—readily available, inexpensively made new clothing,” which has created “an environmental and social justice crisis.”  Christine Ekenga, an assistant professor at Washington University in St. Louis, has co-authored a paper on the “global environmental injustice of fast fashion,” asserting that the fast-fashion supply chain has created a dilemma.  While consumers can buy more clothes for less, those who work in or live near textile-manufacturing facilities bear a disproportionate burden of environmental health hazards.  Further, millions of tons of textile waste sit in landfills and other settings, hurting low-income countries that produce many of these clothes.  In the U.S., about 85 percent of the clothing Americans consume–nearly 80 pounds per American per year–is sent to landfills as solid waste.  [See “The Global Environmental Injustice of Fast Fashion” in the journal Environmental Health.]

A high-profile public figure had an epiphany along the same lines that should influence all of us.  The late Doug Tompkins was one of the founders of The North Face and later moved on to help establish the apparel chain Esprit.  At the height of Esprit’s success, he sold his stake in the company for about $150 million and moved to Chile, where he embraced a whole new outlook on life and adopted an important new emphasis on ecology.  He bought up properties for conservation purposes, in this way “paying my rent for living on the planet.”  Most tellingly, he said, “I left that world of making stuff that nobody really needed because I realized that all of this needless overconsumption is one of the driving forces of the [environmental] crisis, the mother of all crises.”  [Sierra magazine, September/October 2019.]

Author Marie Kondo fits in here.  She has earned fame as a de-cluttering expert, helping people who feel overwhelmed with too much stuff to tidy up their homes.  Her focus is on reducing clutter that’s already there, so she doesn’t zero in on new purchases.  But I applaud her overall outlook.  As part of de-cluttering, she advises:  As you consider keeping or letting go of an item, hold it in your hands and ask:  “Does this item bring me joy?”  This concept of ensuring that an item brings you joy could apply to new purchases as well, so long as the item bringing you joy is also one you really need.

What should those of us enmeshed in our consumer culture do?  In The Wall Street Journal in July 2019, April Lane Benson, a “shopping-addiction-focused psychologist and the author of ‘To Buy or Not to Buy:  Why We Overshop and How to Stop’,” suggested that if a consumer is contemplating a purchase, she should ask herself six simple questions:  “Why am I here? How do I feel? Do I need this? What if I wait? How will I pay for it? Where will I put it?”

Benson’s list of questions is a good one.  Answering them could go a long way toward helping someone avoid making a compulsive purchase.  But let’s remember:  Benson is talking about a shopper already in a store, considering whether to buy something she’s already selected in her search for something new.  How many shoppers will interrupt a shopping trip like that to answer Benson’s questions?

I suggest a much more ambitious scheme:  Simply resolve not to buy anything you don’t need!

My 11-year-old granddaughter has the right idea:  She’s a minimalist who has rejected any number of gifts from me, including some fetching new clothes, telling me she doesn’t need them.

When I reflect on my life as a shopper, I now understand why and how I became the shopper I did.  Perhaps, in light of my family history and the increasingly consumption-driven culture I’ve lived through, I didn’t really have an option.

But I have regrets:  I’ve wasted countless hours browsing in stores, looking through racks and poring over shelves for things to buy, many of which I didn’t need, then spending additional hours returning some of the things I had just purchased.

These are hours I could have spent far more wisely.  Pursuing my creative work, exercising more often and more vigorously, doing more to help those in need.

Readers:  Please don’t make the mistakes I have.  Adopt my new philosophy.  You’ll have many more hours in your life to pursue far more rewarding goals than acquiring consumer goods you don’t really need.


The Old Man and the Movies

The Sundance Kid rides again!  Not on horseback but in a 1970s sedan.

In his most recent film (and perhaps his last), The Old Man and the Gun, Robert Redford plays a charming real-life bank robber.  Announcing his retirement from acting, he told Ruthe Stein of the San Francisco Chronicle that he chose the part because he identified with the bank robber’s rebellious spirit, and he wanted his last film to be “quirky and upbeat and fun.”

I have a special fondness for Redford that goes back to his role in his first memorable film, Butch Cassidy and the Sundance Kid.  Redford has called it the “first real film experience I ever had” and “the most fun on any film I’ve had.  It changed my life.”

When I saw the film in Chicago shortly after its release, I was struck by the performances of both Paul Newman (my perennial favorite) as Butch Cassidy and newcomer Redford as the Sundance Kid.

Unbeknown to me, there was a real live double of the Sundance Kid out there, waiting to meet me when I moved to LA a short time later:  my soon-to-be husband.  Once he added a mustache to his otherwise great looks, his resemblance to Redford in that film was uncanny, and I dubbed him the Sundance Kid.  I even acquired a poster of Redford in that role to affix to my office wall as a reminder of my new-found love.

The 1969 film, now fifty years old, holds up very well.  In perhaps its most memorable scene, the two leading men plunge from a cliff into roiling waters below, shouting a now more commonly accepted expletive for probably the first time in movie history.

Newman and Redford play leaders of the “Hole in the Wall Gang,” a group that robs banks, successfully for the most part, until robbing a train gets them into serious trouble.  They alienate Mr. E. H. Harriman of the Union Pacific Railroad, who hires special trackers who relentlessly follow Butch and Sundance.

An endearing scene takes place when the two men approach the home of Etta Place, Sundance’s wife.  News stories have alarmed Etta.  “The papers said they had you.  They said you were dead.”  Sundance’s first reaction:  “Don’t make a big thing of it.”  He pauses and reflects.  Then he says, “No.  Make a big thing of it.”  And they enthusiastically embrace.

Redford’s brilliant career includes a large number of notable Hollywood films.  It’s easy for me to name some favorites:  Downhill Racer in 1969, The Candidate in 1972, The Way We Were and The Sting in 1973, All the President’s Men in 1976, The Natural in 1984, and Out of Africa in 1985.  (A few of these especially resonate with me.)  And in All is Lost, as recently as 2013, Redford shines as an older man on the verge of dying alone in troubled ocean waters.  Outstanding performances, each and every one.

In recent years, as I became an active supporter of NRDC (the Natural Resources Defense Council), an entity vigorously working on behalf of the environment, I began hearing from Redford, who aligned himself with NRDC’s goals and requested additional donations.  I commend him for his strong support for protecting the future of our country and our planet.  His efforts on behalf of the environment seem even more critical now, as we face increasingly dire problems caused by climate change.

As for Redford’s movie career, my hope is that he chooses not to retire.  Most movie-goers would welcome seeing new films that include him, even in a small role.  In the meantime, I encourage every film buff to see The Old Man and the Gun.  Featuring a number of brief scenes from his earlier movies (plugged into the movie by director David Lowery), the film is a great reminder of a storied Hollywood career.  A career that began with the Sundance Kid.


But Is It Reunion-Worthy? (updated for 2017)


I’m fed up with closets stuffed with unwearable clothes.  Before I make another purchase, I’m asking myself:  “Is it reunion-worthy?”

Let me explain.

I’ve never been a big spender.  Au contraire.  I’ve always relished hunting for earth-shattering bargains.

But things have changed.  When I go shopping, I have a compelling new reason to think carefully before I buy.

My class reunion.

With a class reunion looming, the prospect of seeing my classmates has led me to rethink how I shop for clothes.

That’s led me to scrutinize my entire wardrobe.  After browsing through a closetful of things I wouldn’t dream of wearing to my reunion, I’m launching a whole new wardrobe strategy.

The new standard for my purchases? Are they reunion-worthy?

I’m a lifelong bargain-hunter, and one of my favorite pursuits has been scouring the racks of reduced apparel at stores ranging from Macy’s and Nordstrom to small local boutiques.  The result?  My closets are filled with bargains that I never wear.

Not that they don’t fit me.  Okay, I’ll admit that a few of them don’t.  I bought some of them in those giddy moments when I actually thought I was going to wear a size 4 again.  (I can dream, can’t I?)

But even those that fit me perfectly tend to inhabit my closet, unworn.  They looked terrific in the dressing room.  Was it the soft lighting?  Was it the “skinny mirrors”?  (Remember how Elaine on the ‘90s sitcom “Seinfeld” accused Barney’s of having skinny mirrors?)

I happily toted my bargains home.  Then came the moment of truth.  I emptied my shopping bags and tried everything on again.

Sadly, by the time I stood in front of my bedroom mirror and concluded that some items didn’t flatter me, the time for returning them had expired, and I was permanently and unalterably stuck with them.

Now with my class reunion coming up, and with closets full of things I wouldn’t dream of wearing when I get there, I’ve launched my new wardrobe strategy.

Here’s how it works:  I’ll view each potential purchase as something I’d actually wear to my class reunion.

We all know how we want to look at a class reunion.  Whether it’s high school, college, or any other reunion, we want to look fabulous.  Every item has to show us off to our best advantage.

Remember those classmates who were slim and sleek when you were kind of puffy?  Mercifully, thanks to your fitness regime and a healthier diet, you’ve pared down your poundage, and you want everyone to know it.  Is there any question you’ll view every possible purchase with that in mind?  Just ask yourself, “Does this make me look as slim as possible?”  If not, don’t buy it.  It’s simply not reunion-worthy.

Then there’s the question of style.  Take a good look at those clothes that haven’t been stylish for a while.  Do you really want to wear them at the reunion?  Doubtful.  They’re not reunion-worthy.

This awakening has taught me a lesson, and you might benefit from it as well.  Just start taking this approach to everything you buy.  So what if an outfit’s been reduced from $200 to a rock-bottom 39 bucks?  Don’t buy it unless it’s reunion-worthy.  That sweater or jacket you fell in love with at the store?  Even though it’s terribly chic, it’s styled for someone with a totally different shape.  Forget it.  It’s not reunion-worthy.

Shopping online might present a challenge.  You have to take a chance that something will look great…and be willing to send it back if it doesn’t.

It’s probably easier to hunt for clothes in your favorite brick-and-mortar stores.  But when you do, try to remember our newly-minted wardrobe strategy.  You’ll finally have closets no longer stuffed with unwearable clothes.  They’ll be filled instead with only those clothes that make you look terrific.

I hope to have a great time at my reunion.  Just in case you’re wondering, I plan to look smashing–garbed in my reunion-worthy duds!


[Earlier versions of this post appeared in the San Francisco Chronicle and on this blog.]


Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good brown fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight-gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

The Summer of Love and Other Random Thoughts

1. The CEO pay ratio is now 271-to-1.

According to the Economic Policy Institute’s annual report on executive compensation, released on July 20, chief executives of America’s 350 largest companies made an average of $15.6 million in 2016, or 271 times what the typical worker made that year.

The number was slightly lower than it was in 2015, when the average pay was $16.3 million, and the ratio was 286-to-1.   And it was even lower than the highest ratio calculated, 376-to-1 in 2000.

But before we pop any champagne corks because of the slightly lower number, let’s recall that in 1989, after eight years of Ronald Reagan in the White House, the ratio was 59-to-1, and in 1965, in the midst of the Vietnam War and civil rights turmoil, it was 20-to-1.

Let’s reflect on those numbers for a moment.  Just think about how distorted these ratios are and what they say about our country.

Did somebody say “income inequality”?

[This report appeared in the San Francisco Chronicle on July 21, 2017.]


2. Smiling

 I’ve written in this blog, at least once before, about the positive results of smiling.  [Please see “If You’re Getting Older, You May Be Getting Nicer,” published on May 30, 2014.]

But I can’t resist adding one more item about smiling.  In a story in The Wall Street Journal in June, a cardiologist named Dr. John Day wrote about a woman, aged 107, whom he met in the small city of Bapan, China.  Bapan is known as “Longevity Village” because so many of its people are centenarians (one for every 100 who live there; the average in the U.S. is one in 5,780).

Day asked the 107-year-old woman how she reached her advanced age.  Noting that she was always smiling, he asked if she smiled even through the hard times in her life.  She replied, “Those are the times in which smiling is most important, don’t you agree?”

Day added the results of a study published in Psychological Science in 2010.  It showed that baseball players who smiled in their playing-card photographs lived seven years longer, on average, than those who looked stern.

So, he wrote, “The next time you’re standing in front of a mirror, grin at yourself.  Then make that a habit.”

[Dr. Day’s article appeared in The Wall Street Journal on June 24-25, 2017.]


3. The Summer of Love

This summer, San Francisco is awash in celebrations of the “Summer of Love,” the name attached to the city’s summer of 1967.   Fifty years later, the SF Symphony, the SF Jazz Center, a bunch of local theaters, even the Conservatory of Flowers in Golden Gate Park, have all presented their own take on it.

Most notably, “The Summer of Love Experience,” an exhibit at the de Young Museum in Golden Gate Park, is a vivid display of the music, artwork, and fashions that popped up in San Francisco that summer.

As a happy denizen of San Francisco for the past 12 years, I showed up at the de Young to see the exhibit for myself.

My favorite part of the exhibit was the sometimes outrageous fashions artfully displayed on an array of mannequins.  Not surprisingly, they included a healthy representation of denim.  Some items were even donated by the Levi’s archives in San Francisco.  [Please see the reference to Levi’s in my post, “They’re My Blue Jeans and I’ll Wear Them If I Want To,” published in May.]

Other fashions featured colorful beads, crochet, appliqué, and embroidery, often on silk, velvet, leather, and suede.  Maybe it was my favorite part of the exhibit because I’ve donated clothing from the same era to the Chicago History Museum, although my own clothing choices back then were considerably different.

Other highlights in the exhibit were perfectly preserved psychedelic posters featuring rock groups like The Grateful Dead, The Doors, and Moby Grape, along with record album covers and many photographs taken in San Francisco during the summer of 1967.  Joan Baez made an appearance as well, notably with her two sisters in a prominently displayed anti-Vietnam War poster.  Rock and roll music of the time is the constant background music for the entire exhibit.

In 1967, I may have been vaguely aware of San Francisco’s Summer of Love, but I was totally removed from it.  I’d just graduated from law school, and back in Chicago, I was immersed in studying for the Illinois bar exam.  I’d also begun to show up in the chambers of Judge Julius J. Hoffman, the federal district judge for whom I’d be a law clerk for the next two years.  [Judge Hoffman will be the subject of a future post or two.]

So although the whole country was hearing news stories about the antics of the thousands of hippies who flocked to Haight-Ashbury and Golden Gate Park in San Francisco, my focus was on my life in Chicago, with minimal interest in what was happening 2000 miles away.  For that reason, much of the exhibit at the de Young was brand-new to me.

The curators of the exhibit clearly chose to emphasize the creativity of the art, fashion, and music of the time.  At the same time, the exhibit largely ignores the downside of the Summer of Love—the widespread use of drugs, the unpleasant changes that took place in the quiet neighborhood around Haight-Ashbury, the problems created by the hordes of young people who filled Golden Gate Park.

But I was glad I saw it–twice.

You may decide to come to San Francisco to see this exhibit for yourself.

If you do, please don’t forget:  “If you’re going to San Francisco, be sure to wear some flowers in your hair.”


Declare Your Independence: Those High Heels Are Killers

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body.

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.”

Like those young women, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I managed to secure male attention nevertheless.

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters suffer in spades.

The recent trend toward higher and higher heels is disturbing.  I’m baffled by women, especially young women, who buy into the mindset that they must follow the dictates of fashion and the need to look “sexy” by wearing extremely high heels.

When I watch TV, I see too many women wearing stilettos that force them into the ungainly walk I briefly sported so long ago.  I can’t help noticing the women on late-night TV shows who are otherwise smartly attired and often very smart (in the other sense of the word), yet wear ridiculously high heels that force them to greet their hosts with that same ungainly walk.  Some appear on the verge of toppling over.  And at a recent Oscar awards telecast, women tottered to the stage in ultra-high heels, often accompanied by escorts who kindly held onto them to prevent their embarrassing descent into the orchestra pit.

The women who, like me, have adopted lower-heeled shoes strike me as much smarter and much less likely to fall on their attractive (and sometimes surgically-enhanced) faces.

Here’s another example.  When I sat on the stage of Zellerbach Hall at the Berkeley commencement for math students a few years ago, I was astonished that many if not most of the women graduates hobbled across the stage to receive their diplomas in three- and four-inch-high sandals.  I was terrified that these super-smart math students would trip and fall before they could grasp the document their mighty brain-power had earned.  (Fortunately, none of them tripped, but I could nevertheless imagine the foot-pain that accompanied the joy of receiving their degrees.)

Foot-care professionals soundly support my view.   According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.

The San Francisco Chronicle recently questioned Dr. Amol Saxena, a podiatrist and foot and ankle surgeon who practices in Palo Alto (and assists Nike’s running team).  He explained that after 1.5 inches, the pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He doesn’t endorse 3-inch heels and points out that celebrities wear them for only a short time (for example, on the red carpet), not all day.  To ensure a truly comfortable shoe, he adds, don’t go above a 1.5-inch heel.  If you insist on wearing higher heels, limit how much time you spend in them.

Some encouraging changes may be afoot.  The latest catalog from Nordstrom, one of America’s major shoe-sellers, features a large number of lower-heeled styles along with higher-heeled numbers.  Because Nordstrom is a bellwether in the fashion world, its choices can influence shoe-seekers.  Or is Nordstrom reflecting what its shoppers have already told the store’s decision-makers?  The almighty power of the purse—how shoppers are choosing to spend their money—probably plays a big role here.

Beyond the issue of comfort, let’s remember that high heels present a far more urgent problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When women need to flee a dangerous situation, it’s pretty obvious that high heels can handicap their ability to escape.

How many other needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I urge the women of America to declare their independence from high-heeled shoes.

If you’re currently wearing painful footwear, bravely throw those shoes away, or at the very least, toss them into the back of your closet.   Shod yourself instead in shoes that allow you to walk—and if need be, run—in comfort.

Your wretched appendages, yearning to be free, will be forever grateful.


[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]