Tag Archives: health

I Shoulda Ran

I just came across some great news for joggers.  Researchers have found that strenuous exercise like jogging does NOT boost the risk of arthritis in one’s knees.  A recent study enlisted nearly 1,200 middle-aged and older people at high risk for knee arthritis.  Result?  After 10 years, those who did strenuous activities like jogging and cycling were no more likely to be diagnosed with arthritis than those who did none. (See the July/August 2020 issue of Nutrition Action, noting a study reported in the New England Journal of Medicine.)

And according to a writer in The Washington Post, most data show that running actually helps keep knee joints lubricated.  (See the report by John Briley on August 6, 2020.)

Hmmm…

So…maybe I shoulda ran?

What?

I’ll explain.

When my daughters were small, my husband and I often relied on PBS kids’ programming to keep us from going bananas whenever we were home with them for more than a few hours.

I’m still indebted to “Sesame Street” and “Mr. Rogers’ Neighborhood” for offering wonderfully positive content that expanded our daughters’ minds.

I can still remember many of Fred Rogers’s episodes and his delightful music.  The recent films (e.g., “A Beautiful Day in the Neighborhood”) that highlight his music and the many layers of his unfailing kindness are moving tributes to everything he did.  (I obliquely noted Rogers’s important role in our family when I briefly mentioned him in my 2011 novel, Jealous Mistress.)

Similarly, I can’t forget countless “Sesame Street” sketches and songs we watched over and over again. In addition to stalwarts like Kermit the Frog and Big Bird, I loved less-prominent Muppet characters like Don Music, who’d take out his creative frustrations by crashing his head on his piano keyboard.

One “Sesame Street” sketch I vividly recall focused on words that rhymed with “an.”

The setting is a rundown alley in a big city.  Tall buildings loom in the distance.  As the sketch begins, two Muppets garbed as gangsters breathlessly arrive at this spot.  The savvier gangster tells his partner Lefty that “We got the ‘Golden AN’.”

The word “AN” is clearly written in bold upper-case letters on a metal object he’s holding.  Explaining their “plan,” he points to a “tan van” and says, “This is the plan. You see that van? You take the Golden An to the tan van.  You give it to Dan, who will give it to Fran.”  He adds:  “Everything I’m telling you about the plan rhymes with AN.”  He takes off, leaving Lefty alone.

Lefty, who’s pretty much a dolt, repeats the plan out loud a couple of times while a Muppet cop is watching and listening.  The cop approaches, identifies himself as “Stan…the man,” and tells Lefty he’s going to get “10 days in the can for stealing the Golden An.”

Lefty then chides himself:  “I shoulda ran.”

This carefully crafted sketch was clearly intended to teach little kids about words that rhyme with “an,” although much of it seemed aimed at parents and other adults watching along with the kids.  How many little ones knew the meaning of “the can”?  The bad grammar in the sketch (“I shoulda ran”) was forgivable because kids watching “Sesame Street” didn’t really notice it, and the whole thing was so darned funny.

But what has stayed with me over the decades is the final line:  I shoulda ran.

When I was growing up, I always liked running fast, and I rode my fat-tire Schwinn bike all over my neighborhood.  So I wasn’t indolent.  But as I grew older and entered public high school in Chicago, I encountered the blatantly sexist approach to sports.  Aside from synchronized swimming, my school offered no team sports for girls.  So although I would have loved to be on a track team, that simply wasn’t possible.  Girls couldn’t participate in gymnastics, track, basketball, baseball, tennis, or any of the other teams open to boys our age.

We were also actively discouraged from undertaking any sort of strenuous physical activity.  It was somewhat ironic that I applied to be, and became, the sports editor of my high school yearbook, because I was completely shut out of the team sports I covered in its pages.  And I foolishly gave up my coveted spot in the drama group to do it—what a mistake!

I had a somewhat different experience during my single semester in school in Los Angeles, where I spent the first half of 8th grade.  Although sexism was equally pervasive there, girls at least had a greater opportunity to benefit from physical activity.  Because of the beautiful weather, we played volleyball outdoors every day, and I actually learned not to be afraid of the ball!  I was prepared, when we returned to Chicago (reluctantly on my part), to enjoy a similar level of activity during my four years of high school.  But that would not happen.   The girls’ P.E. classes were a joke, a pathetic attempt at encouraging us to move our bodies.  And things didn’t begin to change until 1972, when Title IX was enacted into law.

Over the years, I continued to ride a bike wherever I lived and whenever weather permitted. I took up brisk walking and yoga as well.  And I sometimes thought about running.

Jogging, a less intense form of running, took off in the late 1970s and early 1980s.  Why didn’t I begin to jog?

There were several reasons.  First, I was afraid of damaging my knees.  I’ve always loved aerobic dancing, the kind popularized by Jacki Sorensen.  I’d jump along with the music in my favorite Jacki tape, but I began to notice occasional knee pain, a sign that all that jumping might be wearing away the cartilage in my knee joints.  So I kept dancing, but I stopped jumping.  I figured that running would place even more stress on my knees.

And then there was Jim Fixx.

I didn’t know a lot about Jim Fixx.  He became a media celebrity when he published his best-selling book, The Complete Book of Running, in 1977, and his claims about the health benefits of jogging suddenly showed up on the news.  But in 1977, I had a brand-new baby and a toddler, along with a challenging part-time job, and I couldn’t focus on starting something new like jogging.  By the time I was getting ready to launch into it, I heard the news that Fixx had died of a heart attack while jogging.  He was 52.

Fixx’s death shook me up.  I didn’t know at the time that he may have had a genetic predisposition to heart trouble, and that he had lived a stressful and unhealthy life as an overweight heavy smoker before he began running at age 36.  All I knew was that this exemplar of health through running had died, while jogging, at age 52.

Chicago weather also stood in my way.  Happily ensconced in an area that allowed our family to ride our bikes along Lake Michigan and quiet residential streets, and where I could take long and pleasant walks with my husband, I was reasonably active outdoors during the six months of the year when good weather prevailed.  But during the harsh winters, confined indoors, I had less success.  I played my Jacki tapes, I tried using a stationary bike (it never fit me comfortably), and I sampled a local gym.  But I didn’t pursue strenuous exercise.

Now, learning about the recent evidence I’ve noted–that, if I’d jogged, my knees might have been OK after all–I regret that choice.  My current climate allows me to be outside almost every day, and I take advantage of it by briskly walking about 30 minutes daily, much of it uphill.  So that’s my workout now, and it’s a pretty good one.

But I probably would have loved running all those years.

It’s a bit late to start now, but I can’t help thinking:  I shoulda ran.

Two Words

Do you remember this scene in the 1967 film “The Graduate”?

New college graduate Benjamin encounters a friend of his father’s at a party.  The friend pulls him aside and says, “I just want to say one word to you. Just one word.  Plastics.”

That advice may have helped college grads in ‘67, but the world we face today is very different.

In light of the raging global pandemic, and the stress it’s placed on all of us, I now have two words for you.  Elastic waists.

Many of us have recently begun wearing clothes with elastic waists.

On June 26, The Wall Street Journal noted:  “The Covid 15 Have Made Our Clothes Too Tight.”  Reporter Suzanne Kapner clearly outlined the problem.  “People spent the spring sheltering at home in sweatpants, perfecting banana-bread recipes and indulging in pandemic-induced stress-eating.”  And while most of us have escaped Covid-19, we haven’t escaped the “Covid 15”—the weight-gain pushing Americans into “roomier wardrobes.”

Hence the widespread adoption of elastic waists.

Many shoppers have jumped on the scale, been horrified, and concluded that they needed to buy new clothes to fit their new shapes.  One woman, unable to zip up her pants, got on her scale.  “Holy moly,” she told Kapner, “I gained 11 pounds in three weeks.”

Kapner cited more evidence:  First, Google searches for “elastic waist” have spiked. Further, body-measuring apps have reported a jump in people choosing looser fits to accommodate their new profiles.  As the CEO of one such app observed, people are “sizing up” because they’ve gained weight.  Less active and more confined, they’re “eating more, either out of stress or boredom.”

In light of this phenomenon, some retailers are increasing orders of clothes in bigger sizes.  They’re also painfully aware of something else:  the rise in returns because of size changes.  Returns have probably doubled in the past three months, according to a software company that processes returns for over 200 brands.  And when customers order an item in their former size and need to exchange it for a larger one, retailers who offer free shipping and free returns find that the additional returns eat into their profits.

This move into larger sizes and elastic waists doesn’t surprise me.  I long ago adopted wearing pants with elastic waists.  Not all of my pants, to be sure.  But many of them.

It probably started when I was pregnant with my first child.  As my abdominal area began to expand, I searched my closet and came across some skirts and pants with elastic waistbands.  I discovered that I could wear these throughout my pregnancy, adding extra elastic as needed.  I bought some maternity clothes as well, but the pants with those stretchy elastic waistbands allowed me to avoid buying a lot of new items.

Over the years, I’ve continued to wear elastic-waist pants, enjoying the comfort they afford.  (Yes, I also wear pants and jeans with stitched-down waistbands that fit me.)

I can understand why there’s a new emphasis on buying elastic waists in lieu of bigger sizes.  A bigger size might be OK for right now, but you probably hope to return to your former size sometime.  Elastic waists are exactly what they claim to be:  elastic.  That means they can expand, but they can also contract.

Both women and men can benefit from wearing elastic waists, at least until they’ve shed the additional pounds they’ve recently acquired.

Many women have traditionally turned to elastic waists because they don’t have the typical “hourglass” shape women are expected to sport.  They have what’s been called an “apple” shape, with a somewhat larger waist measurement than most women have.  In the past, they might have purchased clothes with a tight waistband and then had a tailor make the waistband bigger.

But right now, tailoring clothes is almost impossible. Who’s leaving the safety of home simply to find a tailor to alter a waistband?  The pandemic has put such tailoring out of reach for most of us.  And if an elastic waist makes it unnecessary, it’s saving you the trouble and expense of seeking out a tailor.

Men with expanding waists have also benefited from elastic waists.  The popularity of sweatpants and other casual wear with elastic waists for men is proof of that.

I recognize the role stress is playing in our lives right now, and it’s pretty obvious that we can attribute some weight-gain to the increased level of stress.  So, to avoid buying more and more elastic waists and/or bigger sizes, we need to reduce stress as much as we can.

The advice we’ve all heard for a long time still holds, and it probably applies now more than ever.  At the risk of sounding preachy, I’m adding a few new tips to the tried-and-true list.  (Feel free to skip it if you think you’ve heard it all before.)

  • Be more physically active. Please remember:  You don’t need to go to a gym or even do vigorous workouts at home.  Simply taking a fairly fast-paced stroll in your neighborhood is good enough.
  • Avoid fixating on TV news, especially the bad stuff.
  • Watch distracting TV programming instead (this includes reliably funny films like “Some Like It Hot” and “What’s Up, Doc?” if you can find them).
  • Play music that makes you happy.
  • Connect with friends and family by phone, email, or text (or by writing actual letters).
  • Give meditation a try, just in case it may help you.
  • Try to follow a diet focused on fresh fruit, veggies, high-fiber carbs, and lean protein.
  • Curl up with a good book. (Forgive me for plugging my three novels,* but each one is a fast read and can take you to a different time and place, a definitely helpful distraction.)

Although I admit that I’m still wearing the elastic waists I already own, I’ve so far been able to avoid the “Covid 15” that might require buying new ones.  What’s helped me?

First, briskly walking in my neighborhood for 30 minutes every day.  Second, resisting the lure of chocolate as much as I can.  Instead, I’ve been relying on heaps of fruits, veggies, popcorn, pretzels, and sugarless gum.  (My chief indulgences are peanut butter and fig bars.)  It’s as simple as that.

Maybe you can avoid it, too.  Good luck!

 

*A Quicker Blood, Jealous Mistress, and Red Diana are all available as paperbacks and e-books on Amazon.com.


Waiting for a Vaccine


While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People were aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young) — for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear.  It was a new disease with an unknown cause, and this led to widespread fear.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  Until there is a vaccine, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catch phrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Using a name like that of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like John Kerry’s demand when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an orally administered vaccine that was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to kill deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally confront us with only one way to dry our hands:  paper towels OR air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive study of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers use more energy than producing a paper towel, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show that the opposite is true.  These studies found that blowers tend to spread bacteria from 3 to 6 feet.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can ourselves take action by asking those who manage the restrooms we frequent to adopt the use of paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.

 

Coal: A Personal History

It’s January, and much of the country is confronting freezing temperatures, snow, and ice.  I live in San Francisco now, but I vividly remember what life is like in cold-weather climates.

When I was growing up on the North Side of Chicago, my winter garb followed this pattern:

Skirt and blouse, socks (usually short enough to leave my legs largely bare), a woolen coat, and a silk scarf for my head.  Under my coat, I might have added a cardigan sweater.  But during the freezing cold days of winter (nearly every day during a normal Chicago winter), I was always COLD—when I was outside, that is.

My parents were caring and loving, but they followed the norms of most middle-class parents in Chicago during that era.  No one questioned this attire.  I recall shivering whenever our family ventured outside for a special event during the winter.  I especially remember the excitement of going downtown to see the first showing of Disney’s “Cinderella.”  Daddy parked our Chevy at an outdoor parking lot blocks from the theater on State Street, and we bravely faced the winter winds as we made our way there on foot.  I remember being COLD.

School days were somewhat different.  On bitter cold days, we girls were allowed to cover our legs, but only if we hung our Levi’s in our lockers when we arrived at school.  We may have added mufflers around our heads and necks to create just a little more warmth as we walked blocks and blocks to school in the morning, back home for lunch, then back to school for the afternoon.

Looking back, I can’t help wondering why it never occurred to our parents to clothe us more warmly.  Weren’t they aware of the warmer winter clothing worn elsewhere?  One reason we didn’t adopt warmer winter garb, like thermal underwear, down jackets, or ski parkas, may have been a lack of awareness that such things existed.  Or the answer may have been even simpler:  the abundance of coal.

Inside, we were never cold.  Why?  Because heating with coal was ubiquitous.  It heated our apartment buildings, our houses, our schools, our stores, our movie theaters, our libraries, our public buildings, and almost everywhere else.  Radiators heated by coal hissed all winter long.  The result?  Overheated air.

Despite the bleak winter outside, inside I was never cold.  On the contrary, I was probably much too warm in the overheated spaces we inhabited.

Until I was 12, we lived in an apartment with lots of windows.  In winter the radiators were always blazing hot, so hot that we never felt the cold air outside.  The window glass would be covered in condensed moisture, a product of the intensely heated air, and I remember drawing funny faces on the glass that annoyed my scrupulous-housekeeper mother.

Where did all that heat come from?  I never questioned its ultimate source.

I later learned that the coal was extracted from deep beneath the earth.  But what happened to it above ground was no secret.  More than once, I watched trucks pull up outside my apartment building to deliver large quantities of coal.  The driver would set up a chute that sent the coal directly into the basement, where all those lumps of coal must have been shoveled into a big furnace.

Coal was the primary source of heat back then, and the environment suffered as a result.  After the coal was burned in the furnace, its ashes would be shoveled into bags.  Many of the ashes found their way into the environment.  They were, for example, used on pavements and streets to cope with snow and ice.

The residue from burning coal also led to other harmful results.  Every chimney spewed thick sooty smoke all winter, sending into the air the toxic particles that we all inhaled.

Coal was plentiful, cheap, and reliable.  And few people were able to choose alternatives like fireplaces and wood-burning furnaces (which presented their own problems).

Eventually, cleaner and more easily distributed fuels displaced coal for heating.  Residential use dropped, and according to one source, coal today accounts for less than one percent of home heating fuel.

But coal still plays a big part in our lives.  As Malcolm Turnbull, the former prime minister of Australia (which is currently suffering the consequences of climate change), wrote earlier this month in TIME magazine, the issue of “climate action” has been “hijacked by a toxic, climate-denying alliance of right-wing politics and media…, as well as vested business interests, especially in the coal industry.”  He added:  “Above all, we have to urgently stop burning coal and other fossil fuels.”

In her book Inconspicuous Consumption: The Environmental Impact You Don’t Know You Have, Tatiana Schlossberg points out that we still get about one-third of our electricity from coal.  So “streaming your online video may be coal-powered.”  Citing a 2014 EPA publication, she notes that coal ash remains one of the largest industrial solid-waste streams in the country; largely under-regulated, it ends up polluting groundwater, streams, lakes, and rivers.

“As crazy as this might sound,” Schlossberg writes, watching your favorite episode of “The Office” might come at the expense of clean water for someone else.  She’s concerned that even though we know we need electricity to power our computers, we don’t realize that going online itself uses electricity, which often comes from fossil fuels.

Illinois is finally dealing with at least one result of its longtime dependence on coal.  Environmental groups like Earthjustice celebrated a big win in 2019 when they helped secure passage of milestone legislation strengthening rules for cleaning up the state’s coal-ash dumps.  In a special report, Earthjustice noted that coal ash, the toxic residue of burning coal, has been dumped nationwide into more than 1,000 unlined ponds and landfills, where it leaches into waterways and drinking water.

Illinois in particular has been severely impacted by coal ash.  It is belatedly overhauling its legacy of toxic coal waste and the resulting widespread pollution in groundwater near its 24 coal-ash dumpsites.  The new legislation funds coal-ash cleanup programs and requires polluters to set aside funds to ensure that they, not taxpayers, pay for closure and cleanup of coal-ash dumps.

Earthjustice rightfully trumpets its victory, which will now protect Illinois residents and its waters from future toxic pollution by coal ash.  But what about the legacy of the past, and what about the legacy of toxic coal particles that entered the air decades ago?

As an adult, I wonder about the huge quantities of coal dust I must have inhaled during every six-month-long Chicago winter that I lived through as a child.  I appear to have so far escaped adverse health consequences, but that could change at any time.

And I wonder about others in my generation.  How many of us have suffered or will suffer serious health problems as a result of drinking polluted water and inhaling toxic coal-dust particles?

I suspect that many in my generation have been unwilling victims of our decades-long dependence on coal.

 

 

Eating Dessert Can Help You Eat Better? Seriously?

I just celebrated my birthday with a scrumptious meal at a charming San Francisco restaurant. Sharing a fabulous candle-topped dessert with my companion was a slam-dunk way to end a perfect meal in a splendid restaurant.

Should I regret consuming that delicious dessert?

The answer, happily, is no.  I should have no regrets about eating my birthday surprise, and a recent study backs me up.

According to this study, published in the Journal of Experimental Psychology: Applied and reported in a recent issue of TIME magazine, having an occasional dessert may actually be a useful tool to help you eat better.

Here’s what happened:  More than 130 university students and staff members were asked to choose one of two desserts at the start of the lunch line in a campus cafeteria.  The study found that those who made the “decadent” selection—lemon cheesecake—chose healthier meals and consumed fewer calories overall than those who picked fresh fruit.  Simply selecting dessert first was enough to influence the rest of their order.

Almost 70 percent of those who picked the cheesecake went on to choose a healthier main dish and side dish, while only about a third of those selecting fruit made the healthier choice.  The cheesecake-choosers also ate about 250 fewer total calories during their meal compared with the fruit-choosers.

Study co-author Martin Reimann, an assistant professor of marketing and cognitive science at the University of Arizona, concluded that choosing something healthy first can give us a “license” to choose something less healthy later.  But if you turn that notion around and choose something more “decadent” early on, “then this license [to choose high-calorie food] has already expired.”  In other words, making a calorie-laden choice at the beginning of the meal seems to steer people toward healthier choices later.

No one is suggesting that we all indulge in dessert on an everyday basis.  For many of us, the pursuit of good health leads us to avoid sugary desserts and choose fresh fruit instead.  But Reimann believes that choosing dessert strategically can pay off.  He advises us to be “mindful and conscious about the different choices you make.”

Will I order lemon cheesecake, a chocolate brownie, or a spectacular ice-cream concoction for dessert at my next meal?  Probably not.  But I am going to keep the Arizona research in mind.

You should, too.  Beginning your meal with the knowledge that it could end with a calorie-laden dessert just might prompt you to select a super-healthy salad for your entrée, adding crunchy green veggies on the side.

 

Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These nay-sayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke nothing more than a 3 out of 5 on a happiness rating, the result was very different.  The recipients reported how happy they were to get the thank-you notes:  many said they were “ecstatic,” scoring 4 out of 5 on the happiness rating.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other positive benefits:  gratitude enhances empathy and reduces aggression (a 2012 study at the University of Kentucky); it improves sleep (a 2011 study in Applied Psychology: Health and Well-Being); and it boosts self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives that are filled with what might be called “blessings” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world can join us in our much more gratitude-worthy place on this planet.

 

Let’s keep going as long as we can

One thing everyone can agree on:  Every single day, we’re all getting older.

But we don’t have to let that indisputable fact stop us from doing what we want to do.

I just came across a spectacular example of a 96-year-old scientist who keeps on going and going and going….

By sheer coincidence, he’s a man who’s worked for decades in the field of battery speed and capacity.  And he’s very much more than good enough to serve as an astounding example of enduring optimism and hard work.

A Wall Street Journal story in August profiled John Goodenough, who helped invent the lithium-ion battery that’s used to recharge cell phones and a host of other electronic products.  By introducing lithium cobalt oxide to the inner workings of batteries in 1980, he made batteries not only more powerful but also more portable.

At age 96, he now wants to kill off his own creation by removing the cobalt that allowed his battery to charge faster and last longer.  In April 2018, he and three co-authors published research that may lead to a new battery that’s liquid-free and cobalt-free.

Initial research shows that the new battery could potentially double the energy density of the lithium-ion battery.  That would mean that an electric car, for example, could drive twice as far on one charge.

“My mission is to try to see if I can transform the battery world before I die,” Dr. Goodenough says.  He adds that he has no plans to retire.  “When I’m no longer able to drive and I’m forced to go into a nursing home, then I suppose I will be retiring.”

Goodenough works in an untidy office at the University of Texas in Austin, where he’s a professor of engineering.  He begins work between 8 and 8:30 a.m., leaves around 6 p.m., and works from home throughout the weekend.

He hand-writes his research and doesn’t own a cell phone, rejecting the mobile technology that his batteries made possible.  His car is a 10-year-old Honda that he hopes will last as long as he does.

His motivation is to help electric cars wean society off its dependence on the combustion engine, like the one in his Honda.

“He is driven by scientific curiosity, and he really wants to do something for society with the science he does,” says one of his colleagues, another engineering professor at UT, Arumugam Manthiram.

Isn’t it heartening to come across someone like John Goodenough, a remarkable human being who refuses to quit?

His story energizes me.  Although I’m considerably younger than Goodenough, it encourages me to pursue my passions no matter how old I get.

Does his story energize you, too?

 

[This blog post is somewhat shorter than usual because I’m currently in the midst of publishing my third novel, RED DIANA.  I’m hoping it will be available soon at bookstores everywhere and on Amazon.com.]

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

During one or two summers, though, I didn’t totally escape a bad sunburn.  Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burn.  But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid sunscreens harboring suspect chemicals, I use only those whose active ingredients include (guess what?) zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both are inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

Who the Heck Knows?

I have a new catch phrase:  “Who the heck knows?”

I started using it last fall, and ever since then I’ve found that it applies to almost everything that might arise in the future.

I don’t claim originality, but here’s how I came up with it:

At a class reunion in October, I was asked to be part of a panel of law school classmates who had veered off the usual lawyer-track and now worked in a totally different area.

Specifically, I was asked to address a simple question:  Why did I leave my work as a lawyer/law professor and decide to focus primarily on writing?

First, I explained that I’d always loved writing, continued to write even while I worked as a lawyer, and left my law-related jobs when they no longer seemed meaningful.  I added that my move to San Francisco led to launching my blog and publishing my first two novels.

I concluded:

“If I stay healthy and my brain keeps functioning, I want to continue to write, with an increasing focus on memoirs….  I’ll keep putting a lot of this kind of stuff on my blog.  And maybe it will turn into a book or books someday.

“Who the heck knows?”

 

After I said all that, I realized that my final sentence was the perfect way to respond to almost any question about the future.

Here’s why it seems to me to apply to almost everything:

None of us knows what the next day will bring.  Still, we think about it.

In “Men Explain Things to Me,” the author Rebecca Solnit notes “that we don’t know what will happen next, and the unlikely and the unimaginable transpire quite regularly.”  She finds uncertainty hopeful, while viewing despair as “a form of certainty,” certainty that “the future will be a lot like the present or will decline from it.”

Let’s cast certainty aside and agree, with Solnit, that uncertainty is hopeful.  Let’s go on to question what might happen in the uncertain future.

For example:

We wonder whether the midterm elections will change anything.

We wonder whether our kids will choose to follow our career choices or do something totally different.

We wonder whether our family history of a deadly disease will lead to having it ourselves.

We wonder whether to plan a trip to Peru.

We wonder whether we’re saving enough money for retirement.

We wonder how the U.S. Supreme Court will rule in an upcoming case.

We wonder what our hair will look like ten years from now.

We wonder what the weather will be like next week.

And we wonder what the current occupant of the White House will say or do regarding just about anything.

 

You may have an answer in mind, one that’s based on reason or knowledge or probability.   But if you’re uncertain…in almost every case, the best response is:  Who the heck knows?

If you’re stating this response to others, I suggest using “heck” instead of a word that might offend anyone.  It also lends a less serious tone to all of the unknowns out there, some of which are undoubtedly scary.

If you prefer to use a more serious tone, you can of course phrase things differently.

But I think I’ll stick with “Who the heck knows?”

Warning:  If you spend any time with me, you’ll probably hear me say it, again and again.

But then, who the heck knows?