
I Shoulda Ran

I just came across some great news for joggers.  Researchers have found that strenuous exercise like jogging does NOT boost the risk of arthritis in one’s knees.  A recent study enlisted nearly 1,200 middle-aged and older people at high risk for knee arthritis.  Result?  After 10 years, those who did strenuous activities like jogging and cycling were no more likely to be diagnosed with arthritis than those who did none. (See the July/August 2020 issue of Nutrition Action, noting a study reported in the New England Journal of Medicine.)

And according to a writer in The Washington Post, most data show that running actually helps keep knee joints lubricated.  (See the report by John Briley on August 6, 2020.)

Hmmm…

So…maybe I shoulda ran?

What?

I’ll explain.

When my daughters were small, my husband and I often relied on PBS kids’ programming to keep us from going bananas whenever we were home with them for more than a few hours.

I’m still indebted to “Sesame Street” and “Mister Rogers’ Neighborhood” for offering wonderfully positive content that expanded our daughters’ minds.

I can still remember many of Fred Rogers’s episodes and his delightful music.  The recent films (e.g., “A Beautiful Day in the Neighborhood”) that highlight his music and the many layers of his unfailing kindness are moving tributes to everything he did.  (I obliquely noted Rogers’s important role in our family when I briefly mentioned him in my 2011 novel, Jealous Mistress.)

Similarly, I can’t forget countless “Sesame Street” sketches and songs we watched over and over again. In addition to stalwarts like Kermit the Frog and Big Bird, I loved less-prominent Muppet characters like Don Music, who’d take out his creative frustrations by crashing his head on his piano keyboard.

One “Sesame Street” sketch I vividly recall focused on words that rhymed with “an.”

The setting is a rundown alley in a big city.  Tall buildings loom in the distance.  As the sketch begins, two Muppets garbed as gangsters breathlessly arrive at this spot.  The savvier gangster tells his partner Lefty, “We got the ‘Golden AN’.”

The word “AN” is clearly written in bold upper-case letters on a metal object he’s holding.  Explaining their “plan,” he points to a “tan van” and says, “This is the plan. You see that van? You take the Golden An to the tan van.  You give it to Dan, who will give it to Fran.”  He adds:  “Everything I’m telling you about the plan rhymes with AN.”  He takes off, leaving Lefty alone.

Lefty, who’s pretty much a dolt, repeats the plan out loud a couple of times while a Muppet cop is watching and listening.  The cop approaches, identifies himself as “Stan…the man,” and tells Lefty he’s going to get “10 days in the can for stealing the Golden An.”

Lefty then chides himself:  “I shoulda ran.”

This carefully crafted sketch was clearly intended to teach little kids about words that rhyme with “an,” although much of it seemed aimed at parents and other adults watching along with the kids.  How many little ones knew the meaning of “the can”?  The bad grammar in the sketch (“I shoulda ran”) was forgivable because kids watching “Sesame Street” didn’t really notice it, and the whole thing was so darned funny.

But what has stayed with me over the decades is the final line:  I shoulda ran.

When I was growing up, I always liked running fast, and I rode my fat-tire Schwinn bike all over my neighborhood.  So I wasn’t indolent.  But as I grew older and entered public high school in Chicago, I encountered a blatantly sexist approach to sports.  Aside from synchronized swimming, my school offered no team sports for girls.  So although I would have loved to be on a track team, that simply wasn’t possible.  Girls couldn’t participate in gymnastics, track, basketball, baseball, tennis, or any of the other sports open to boys our age.

We were also actively discouraged from undertaking any sort of strenuous physical activity.  It was somewhat ironic that I applied to be, and became, the sports editor of my high school yearbook, because I was completely shut out of the very team sports I covered in it.  And I foolishly gave up my coveted spot in the drama group to do it—what a mistake!

I had a somewhat different experience during my single semester in school in Los Angeles, where I spent the first half of 8th grade.  Although sexism was equally pervasive there, girls at least had a greater opportunity to benefit from physical activity.  Because of the beautiful weather, we played volleyball outdoors every day, and I actually learned not to be afraid of the ball!  I was prepared, when we returned to Chicago (reluctantly on my part), to enjoy a similar level of activity during my four years of high school.  But that would not happen.   The girls’ P.E. classes were a joke, a pathetic attempt at encouraging us to move our bodies.  And things didn’t begin to change until 1972, when Title IX was enacted into law.

Over the years, I continued to ride a bike wherever I lived and whenever weather permitted. I took up brisk walking and yoga as well.  And I sometimes thought about running.

Jogging–a less intense form of running–took off in the late 1970s and early 1980s.  Why didn’t I begin to jog?

There were a bunch of reasons.  First, I was afraid of damaging my knees.  I’ve always loved aerobic dancing, the kind popularized by Jacki Sorensen.  I’d jump along with the music on my favorite Jacki tape, but occasional knee pain made me suspect that all that jumping was beginning to wear away the cartilage in my knee joints.  So I kept dancing, but I stopped jumping.  I figured that running would place even more stress on my knees.

And then there was Jim Fixx.

I didn’t know a lot about Jim Fixx.  He became a media celebrity when he published his best-selling book, The Complete Book of Running, in 1977, and his claims about the health benefits of jogging suddenly showed up on the news.  But in 1977, I had a brand-new baby and a toddler, along with a challenging part-time job, and I couldn’t focus on starting something new like jogging.  By the time I was getting ready to launch into it, I heard the news that Fixx had died of a heart attack while jogging.  He was 52.

Fixx’s death shook me up.  I didn’t know at the time that he may have had a genetic predisposition to heart trouble, or that he had lived a stressful and unhealthy life as an overweight heavy smoker before he began running at age 36.  All I knew was that this exemplar of health through running had died, while jogging, at age 52.

Chicago weather also stood in my way.  Happily ensconced in an area that allowed our family to ride our bikes along Lake Michigan and quiet residential streets, and where I could take long and pleasant walks with my husband, I was reasonably active outdoors during the six months of the year when good weather prevailed.  But during the harsh winters, confined indoors, I had less success.  I played my Jacki tapes, I tried using a stationary bike (it never fit me comfortably), and I sampled a local gym.  But I didn’t pursue strenuous exercise.

Now, learning about the recent evidence I’ve noted–that, if I’d jogged, my knees might have been OK after all–I regret that choice.  My current climate allows me to be outside almost every day, and I take advantage of it by briskly walking about 30 minutes daily, much of it uphill.  So that’s my workout now, and it’s a pretty good one.

But I probably would have loved running all those years.

It’s a bit late to start now, but I can’t help thinking:  I shoulda ran.

“Who was that masked man?”

If you ever watched “The Lone Ranger,” a TV series that appeared from 1949 to 1957, you probably remember the question that ended every episode:  “Who was that masked man?”  The Lone Ranger, a Texas Ranger turned vigilante who became a pop-culture hero fighting for truth and justice, wore a mask to obscure his identity.

The question seems more appropriate today than ever before.  With most of us donning masks—or another sort of face-covering—it’s impossible to see the entire face of anyone you encounter in the outside world.  We simply have to trust that we won’t run into any evildoers lurking near us wherever we go.  So far I haven’t felt that I needed someone like the L.R. to come to my rescue.

There’s another concern, however.  When I take my daily neighborhood stroll, I find it troubling that, although most of us are now required to wear masks in public, many people I encounter are walking or jogging sans mask.  The most annoying are the joggers, who don’t seem to care that they are exhaling a whole load of droplets every time they breathe, and heck, their droplets just might be contaminated with Covid-19.

In addition to wearing a mask, walkers need to keep at least 6 feet away from each other, and according to an expert quoted in The Washington Post a few days ago, joggers need to run at least 10 feet away from everyone else.  Although some of the people I encounter try to observe those distances, many don’t.

As I walk, I often mutter into my mask (usually a colorful scarf covering my nose and mouth), trying to restrain my irritation with those violating the current guidelines. [Please see my blog post, “Join the ranks of the scarf-wearers,” at https://susanjustwrites.wordpress.com/2020/04/06/join-the-ranks-of-the-scarf-wearers/.]

My mask has actually turned out to be a great way to muffle what I’m not merely thinking but actually saying.  (Sotto voce, of course.)  A favorite:  “Jerk.”  Or worse.  And lately I’ve been borrowing the title of a hilarious children’s book, “The Stupids Die.”

When we were raising our two daughters in the 1980s, we enthusiastically read countless books to them.  Among our favorites were those written and illustrated by James Marshall.  Marshall is probably best known for his delightful series featuring two anthropomorphized hippos called George and Martha.  The series includes seven books published between 1972 and 1988.

George and Martha were “best friends,” and one of the things we loved about them was that they were non-gender-specific friends.  So although Martha would sometimes be drawn wearing a hair bow or a colorful skirt, and George sometimes sported a casual fedora, both Martha and George liked to do the same things and go to the same places.  And no matter what transpired, they were always “best friends.”

But James Marshall didn’t confine his talents to the George and Martha series.  As an illustrator, he collaborated with the writer Harry Allard, who wrote a series of four books featuring a family called The Stupids.  Marshall’s colorful illustrations for these books, published between 1974 and 1989, are knee-slappingly hilarious.

The Stupids are colossally stupid, so much so that in “The Stupids Die,” the Stupids leap to the conclusion that they’re dead when a power outage makes their lights go out, turning their home totally dark.  The truth is revealed at the end, and the reader is left laughing at how astoundingly foolish The Stupids are.

The series had its critics, who griped that the stories promoted low self-esteem and negative behavior.  But most kids loved the stories, and copies are still selling to grown-up fans on Amazon.com.

As I witness the choice made by some walkers and joggers on my route–the choice not to keep the prescribed distance or to wear a mask to protect themselves and others from the potentially virus-saturated droplets in their exhalations– “The Stupids Die” keeps reverberating in my head.

Wearing my own mask has the unexpected benefit of allowing me to say whatever I want as I pass these non-mask-wearing and non-distance-keeping people, who are endangering their own lives as well as mine. So in addition to muttering “Jerk” and other expletives, I frequently mutter “The Stupids Die.”

If anyone should hear me, I can promptly explain that I’m simply recalling the title of a favorite children’s book.  And if they want to interpret those words as words that apply to them, I hope they will do just that.

I’m well aware that most victims of Covid-19 are very smart people who contracted the disease through no fault of their own.  I do NOT include them among “the Stupids.”  And I strongly condemn the violent assaults that have recently erupted, where mask-wearers have attacked those who weren’t wearing masks.

But I do judge harshly those in my own surroundings who don’t appear to care about others, and I declare the following:

To everyone walking and jogging, enjoying the fresh air and sunshine that surround us this May, please remember to wear a mask.  Please remember to stay the correct distance away from me.

And for your own sake, please remember that “The Stupids Die.”


Waiting for a Vaccine

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People had been aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young)—for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear:  it was a new disease with an unknown cause.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  In the meantime, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catch phrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Using a name like that of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between the ages of six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like John Kerry’s demand when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an orally administered vaccine that was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to kill deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These nay-sayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke no more than a 3 out of 5 on a happiness rating, the result was very different.  After receiving the thank-you notes, the recipients reported how happy they were to get them:  many said they were “ecstatic,” scoring 4 out of 5 on the happiness rating.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other benefits:  gratitude enhances empathy and reduces aggression (a 2012 study by the University of Kentucky), it improves sleep (a 2011 study in Applied Psychology: Health and Well-Being), and it improves self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives that are filled with what might be called “blessings” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world can join us in our much more gratitude-worthy place on this planet.

Sunscreen–and a Father Who Cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would often take out a tube of zinc oxide and apply it to my skin.

One summer or two, though, I didn’t escape unscathed, and I suffered a couple of bad sunburns. Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burns. But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid using sunscreens harboring suspect chemicals, I use only those sunscreens whose active ingredients include—guess what—zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both are inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

Who the Heck Knows?

I have a new catch phrase:  “Who the heck knows?”

I started using it last fall, and ever since then I’ve found that it applies to almost everything that might arise in the future.

I don’t claim originality, but here’s how I came up with it:

At a class reunion in October, I was asked to be part of a panel of law school classmates who had veered off the usual lawyer-track and now worked in a totally different area.

Specifically, I was asked to address a simple question:  Why did I leave my work as a lawyer/law professor and decide to focus primarily on writing?

First, I explained that I’d always loved writing, continued to write even while I worked as a lawyer, and left my law-related jobs when they no longer seemed meaningful.  I added that my move to San Francisco led to launching my blog and publishing my first two novels.

I concluded:

“If I stay healthy and my brain keeps functioning, I want to continue to write, with an increasing focus on memoirs….  I’ll keep putting a lot of this kind of stuff on my blog.  And maybe it will turn into a book or books someday.

“Who the heck knows?”

After I said all that, I realized that my final sentence was the perfect way to respond to almost any question about the future.

Here’s why it seems to me to apply to almost everything:

None of us knows what the next day will bring.  Still, we think about it.

In “Men Explain Things to Me,” the author Rebecca Solnit notes “that we don’t know what will happen next, and the unlikely and the unimaginable transpire quite regularly.”  She finds uncertainty hopeful, while viewing despair as “a form of certainty,” certainty that “the future will be a lot like the present or will decline from it.”

Let’s cast certainty aside and agree, with Solnit, that uncertainty is hopeful.  Let’s go on to question what might happen in the uncertain future.

For example:

We wonder whether the midterm elections will change anything.

We wonder whether our kids will choose to follow our career choices or do something totally different.

We wonder whether our family history of a deadly disease will lead to having it ourselves.

We wonder whether to plan a trip to Peru.

We wonder whether we’re saving enough money for retirement.

We wonder how the U.S. Supreme Court will rule in an upcoming case.

We wonder what our hair will look like ten years from now.

We wonder what the weather will be like next week.

And we wonder what the current occupant of the White House will say or do regarding just about anything.

You may have an answer in mind, one that’s based on reason or knowledge or probability.   But if you’re uncertain…in almost every case, the best response is:  Who the heck knows?

If you’re stating this response to others, I suggest using “heck” instead of a word that might offend anyone.  It also lends a less serious tone to all of the unknowns out there, some of which are undoubtedly scary.

If you prefer to use a more serious tone, you can of course phrase things differently.

But I think I’ll stick with “Who the heck knows?”

Warning:  If you spend any time with me, you’ll probably hear me say it, again and again.

But then, who the heck knows?

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting far more worrisome results from rude behavior:  serious errors made by doctors and nurses as a result of rudeness.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies, reported on April 12 in The Wall Street Journal, concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, one team heard an American MD who was observing them declare that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department. The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for The Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions might have been affected by rudeness.

She noted still other studies of rudeness.  In a 2015 British study, “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And other studies found that rudeness toward medical students by attending physicians, residents, and nurses was also a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Audrey Hepburn and Me

I never thought I had a single thing in common with Audrey Hepburn.  She was tall and decidedly slim.  I’m short and, uh, not exactly slim.  She was a brunette with enormous brown eyes.  I’m a redhead with almond-shaped but not-so-enormous hazel eyes.  She was a famed film star who won an Oscar at 24 (for 1953’s Roman Holiday) while my adolescent dreams of becoming an actress never became reality.

So I never saw myself as having anything in common with this glamorous star of the ’50s and ’60s.  But a quick glance at a recent magazine article has convinced me that I have a few things in common with Audrey after all.

The article, appearing in the May issue of Vanity Fair, is based on a new book, Audrey in Rome, written by her younger son, Luca Dotti.  Luca lived with Audrey in Rome from the time of his birth in 1970 until she left for Switzerland (and he went off to a Swiss boarding school) in 1986.  As the magazine cover proclaims, in his book he recalls “the secrets of her iconic style.”

What were some of these secrets?  Well, for one thing, she was “fond of kerchiefs tied under the chin (not wound around and fastened in back in the French manner).”  Her love of sous-chin kerchiefs is apparent in a 1970 photo showing Audrey in a fabulous Givenchy coat and a scarf tied under her chin.

According to Luca, Audrey’s scarves were “a bit of a vice.”  Although she wasn’t “like Imelda Marcos and shoes,” she had “maybe 30 or 40” scarves.  In Rome, she often wore them along with big sunglasses as a disguise, enabling her “to do her shopping without having…crowds” following her.

This is one style-revelation I share with Audrey Hepburn.  My love of scarves, like hers, could be called a vice, but in view of the small amount of space they occupy and the small sums of money they cost, they’re a pretty harmless one.  I have a colorful collection in every possible fabric, suitable for every season, some bestowed on me as charming gifts, others purchased by me in a weak moment.

I admit I’ve never had crowds following me.  But I wear scarves (usually tied under my chin) for my own reasons.  In chilly weather, they keep my head warm.  On warmer days, they shield my curly hair from humidity and wind.

Childhood photos taken by my father show me, like Audrey, wearing scarves tied beneath my chin.  Ever since then, I’ve worn scarves no matter where I’ve made my home—from Chicago to Boston to Los Angeles.  Now, living in breezy San Francisco, I almost never leave home without a scarf in my jacket pocket, prepared to withstand whatever breezes the ocean blows my way.

Some have ridiculed my penchant for wearing scarves.  A friend once muttered that I liked to wear “babushkas.”  That hurt.  But now I can point to Audrey Hepburn as a scarf-loving style icon who, like me, wore scarves tied beneath her chin.

Another secret revealed by Luca is Audrey’s choice of footwear.  Generally basing her style choices on “simplicity and practicality,” she preferred to wear ballerina flats and low heels.  Vanity Fair claims that she wore them partly to accentuate her long feet, “adding to her elegant attenuation.”  (Huh?  Do you know any women with long feet who want to accentuate them?)  But even VF admits the far more likely reason:  she wore them so she “could walk comfortably.”

So here’s another preference I share with Audrey.  Long ago I gave up wearing high heels.  Like Audrey, I like to stride purposefully through the city, and wearing anything but low heels makes that impossible.  Every day I see women struggling with high heels that inhibit their freedom to move through life with ease.  I ache to tell them to forgo those high heels, and like Audrey and me, walk comfortably and safely wherever they go.

[Please note:  I’ve written another post on this blog, “High Heels Are Killers,” explaining at greater length my opinion of high heels.]

If truth be told, when I was younger, I wasn’t a big fan of Audrey Hepburn.  Maybe it was the way Hollywood portrayed her that was to blame.  After Roman Holiday (in which she fell in love with reasonably age-appropriate Gregory Peck), she was paired with male leads who were far too old for her.  At 28 she was supposedly smitten by Gary Cooper, then 56 (and looking even older), in Love in the Afternoon and by 58-year-old Fred Astaire in Funny Face.  I found these pairings simply baffling.  Why would radiant young Audrey fall for men twice her age?  At the time, I was unaware of the way Hollywood worked back then.  It’s clear to me now that she was complying with the demands of the movie moguls who dictated most of the roles she played.

No wonder she confided to friends that her favorite role was that of the nun in The Nun’s Story.  No superannuated men were slobbering over her in that role!

My view of Audrey Hepburn evolved as I learned more about her.  In her later years, she became an activist on behalf of UNICEF, traveling to more than 20 countries around the globe to advocate for the world’s most vulnerable children.  Her advocacy has endeared her to me, a fellow advocate for the underprivileged.

Moreover, during those years, she openly chose to welcome growing older.  Luca remembers that she “was always a little bit surprised by the efforts women made to look young.”  By contrast, “she was actually very happy about growing older because it meant more time for herself, more time for her family, and separation from the frenzy of youth and beauty that is Hollywood.”  She saw aging as part of the circle of life.

Audrey liked to say that “true beauty in a woman is reflected in her soul. It’s the caring that she lovingly gives, the passion that she shows. The beauty of a woman only grows with passing years.”

Some may remember Audrey Hepburn as a stunning style icon, but in my view, she should be remembered for much, much more.