Category Archives: health

Caffeine

I’m addicted.

I admit it.  I’m addicted to caffeine.

I find that I increasingly need caffeine.  It’s become an absolute necessity.  I drink 3 to 4 cups of coffee from about 8 a.m. till about 4 or 5 p.m. Why?  Because I like it.  And because it helps me stay awake when I need to be.

First, a little bit about my relationship to caffeine. 

I remember how my mother drank coffee all day long.  Once I asked her if I could taste it.  I figured that it had to be delicious or she wouldn’t drink so much of it.  So when she said I could taste it, I took a sip.  Yuck!  It tasted terrible.

I didn’t try coffee again until my first year of college, when I discovered that it was drinkable if I put enough milk and sugar in it.  I decided to try it when late-night studying began to take its toll.  I found I’d doze off in class the minute the professor turned off the lights and showed slides on a screen at the front of the classroom.  But I discovered that if I drank coffee at breakfast, I could stay awake.

As I’ve gotten older, I’ve found that consuming caffeine is a necessity.  Especially before sitting in a theater, when (as in college classrooms) the lights are dimmed and I need to stay conscious to enjoy a film, a play, a concert, a ballet performance, or an opera.  Although the pandemic has cramped my style, suspending my theater-going, for example, I’ve continued to rely on caffeine while I read or watch TV at home.

Now let’s look at some of the science behind caffeine.  I won’t bore you with the wonkiest stuff, but you probably want to know something about it.

I found this info in the March 2021 issue of Nutrition Action, a monthly newsletter published by the Center for Science in the Public Interest (CSPI), my go-to source for honest reporting on healthy food choices and the like.  Here’s a summary of the most useful info:

How does caffeine work?  It blocks adenosine receptors in the brain.  Huh?  What’s adenosine?

Adenosine is a natural sedative.  When it builds up, you feel drowsy.  But when caffeine blocks it, you don’t.

But watch out:  You can build up a tolerance to caffeine.  What happens is this:  The more caffeine you consume, the more adenosine receptors your brain makes.  So you need even more caffeine to block those extra receptors and keep you alert.

But how much is too much?  The FDA says that most adults can safely consume up to 400 milligrams a day.  This is roughly the amount in two large cups of coffee at Starbucks or Dunkin’ Donuts.  But the amount of caffeine in your home-brewed coffee can vary.  And caffeine’s impact on people varies.

So you need to judge the impact it has on you.  If you’re having trouble sleeping, or too much coffee makes you feel jittery, you probably need to cut back on how much you imbibe, and pay attention to when you’re imbibing.

You can try to break up with coffee, as famed author Michael Pollan has.  He reports “sleeping like a teenager” and waking “feeling actually refreshed.”  But that experience may not work for everyone.

One study asked 66 young caffeine users who were having trouble sleeping to go “cold turkey.”  But during the week with no caffeine, they spent no more time asleep and took no less time to fall asleep than before.

Still, it’s probably wise to avoid caffeine right before bed.  Studies show that people generally take longer to fall asleep and get less deep sleep when they have caffeine right before bedtime.

Coffee consumption has shown some real benefits.  A lower risk of type 2 diabetes, for one thing.  Better exercise performance, for another.  (Although few studies have looked at the exercise-boosting effect in older adults, one study of 19 Brits aged 61 to 79 showed that they performed better in a battery of physical tests after they consumed caffeine.)  Finally, studies have shown that people who consume more caffeine have a lower risk of Parkinson’s disease.

I get my caffeine from a variety of sources, including coffee, tea, and cola drinks.  I also happily consume coffee candy (my favorite is Caffe Rio, available at Trader Joe’s) and coffee ice cream.  And I heartily recommend the cappuccino gelato at my local gelato shop.  But let’s face it:  a cup of coffee packs the most punch.

The recent advent of cold brew coffee allows coffee-drinkers to get their caffeine in a less acidic form.  According to one source, cold brew is over 67 percent less acidic than hot brewed coffee because the coffee grounds aren’t exposed to high temperatures.  Result:  cold brew appeals to some of us because it’s sweeter, smoother, and less bitter. (But don’t confuse it with iced coffee, which has the same acidity as regular hot coffee.  The ice can dilute it, however.)  I’ve tried cold brew and like it.  I keep a bottle of it in my fridge and frequently drink some.  But it’s much pricier than my home brew, at least for now.

New sources have popped up.  One may be bottled water.  In the bargain bin at a local supermarket, I once came across a bottle of Sparkling Avitae, whose label states that it’s caffeine plus water and natural fruit flavors.  It claims to have “about the same amount of caffeine as a cup of coffee,” thereby giving you “instant go with added fizz.”  According to the manufacturer, it includes “natural caffeine derived from green coffee beans.”  I’m not sure this product is still available.  Possibly something like it is.  My original purchase is stashed in my fridge, but I’ve never tried it.

Even newer:  I recently spied an ad for a cosmetic product called “Eyes Open Caffeine and Peptide Eye Cream.”   Yes, eye cream.  This one claims to be “supercharged with caffeine,” adding that it can “reduce the appearance of puffiness and dark circles.”  Does it work?  Who knows?  I’d guess that it probably works just about as well as any other eye cream.  Dermatologists generally tell their patients not to expect very much from any of them, no matter their price or their claims. 

To sum up, I confess that I ally with Abbie Hoffman, the “Chicago 7” trial defendant.  When the prosecutor asked him whether he was addicted to any drug, Abbie said “Yes.”  Which one?  “Caffeine.”   [Please see Post #9 in my blog series, “Hangin’ with Judge Hoffman,” published on 4/20/21, where I noted this amusing bit of testimony.]

My favorite coffee mug says it all:  Its vintage photo features a stylish woman in glamorous riding gear, holding the reins of her horse, saying “You can lead a horse to water…but I could use a triple expresso.”

And let’s not forget my sticky-note pad featuring a stylishly coiffed woman, circa 1928, drinking what’s clearly a cup of coffee.  She boldly announces:  “Given enough coffee, I could rule the world.”

Well, maybe coffee-drinkers like me should actually try to rule the world.  We might do a better job than most of those who’ve been in charge.

Okay.  I’m addicted.  And my path ahead is clear. 

I’ll continue to reap the benefits of caffeine while at the same time I steer away from any potentially harmful impact.

Maybe you’d like to join me on this path?

Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that will make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People were aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young) — for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear.  It was a new disease with an unknown cause, and this led to widespread fear.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  Until there is a vaccine, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catch phrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Using a name like that of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like John Kerry’s demand when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an orally-administered vaccine that was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to kill deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Coal: A Personal History

It’s January, and much of the country is confronting freezing temperatures, snow, and ice.  I live in San Francisco now, but I vividly remember what life is like in cold-weather climates.

When I was growing up on the North Side of Chicago, my winter garb followed this pattern:

Skirt and blouse, socks (usually short enough to leave my legs largely bare), a woolen coat, and a silk scarf for my head.  Under my coat, I might have added a cardigan sweater.  But during the freezing cold days of winter (nearly every day during a normal Chicago winter), I was always COLD—when I was outside, that is.

My parents were caring and loving, but they followed the norms of most middle-class parents in Chicago during that era.  No one questioned this attire.  I recall shivering whenever our family ventured outside for a special event during the winter.  I especially remember the excitement of going downtown to see the first showing of Disney’s “Cinderella.”  Daddy parked our Chevy at an outdoor parking lot blocks from the theater on State Street, and we bravely faced the winter winds as we made our way there on foot.  I remember being COLD.

School days were somewhat different.  On bitterly cold days, we girls were allowed to cover our legs, but only if we hung our Levi’s in our lockers when we arrived at school.  We may have added mufflers around our heads and necks to create just a little more warmth as we walked blocks and blocks to school in the morning, back home for lunch, then back to school for the afternoon.

Looking back, I can’t help wondering why it never occurred to our parents to clothe us more warmly.  Weren’t they aware of the warmer winter clothing worn elsewhere?  One reason that we didn’t adopt warmer winter garb–like thermal underwear, or down jackets, or ski parkas–may have been a lack of awareness that they existed.  Or the answer may have been even simpler:  the abundance of coal.

Inside, we were never cold.  Why?  Because heating with coal was ubiquitous.  It heated our apartment buildings, our houses, our schools, our stores, our movie theaters, our libraries, our public buildings, and almost everywhere else.  Radiators heated by coal hissed all winter long.  The result?  Overheated air.

Despite the bleak winter outside, inside I was never cold.  On the contrary, I was probably much too warm in the overheated spaces we inhabited.

Until I was 12, we lived in an apartment with lots of windows.  In winter the radiators were always blazing hot, so hot that we never felt the cold air outside.  The window glass would be covered in condensed moisture, a product of the intensely heated air, and I remember drawing funny faces on the glass that annoyed my scrupulous-housekeeper mother.

Where did all that heat come from?  I never questioned its ultimate source.

I later learned that the coal was extracted from deep beneath the earth.  But what happened to it above ground was no secret.  More than once, I watched trucks pull up outside my apartment building to deliver large quantities of coal.  The driver would set up a chute that sent the coal directly into the basement, where all those lumps must have been shoveled into a big furnace.

Coal was the primary source of heat back then, and the environment suffered as a result.  After the coal was burned in the furnace, its ashes would be shoveled into bags.  Much of the ash found its way into the environment.  It was, for example, spread on pavements and streets to cope with snow and ice.

The residue from burning coal also led to other harmful results.  Every chimney spewed thick sooty smoke all winter, sending into the air the toxic particles that we all inhaled.

Coal was plentiful, cheap, and reliable.  And few people were able to choose alternatives like fireplaces and wood-burning furnaces (which presented their own problems).

Eventually, cleaner and more easily distributed forms of heating fuel displaced coal.  Residential use dropped, and according to one source, today it amounts to less than one percent of heating fuel.

But coal still plays a big part in our lives.  As Malcolm Turnbull, the former prime minister of Australia (which is currently suffering the consequences of climate change), wrote earlier this month in TIME magazine, the issue of “climate action” has been “hijacked by a toxic, climate-denying alliance of right-wing politics and media…, as well as vested business interests, especially in the coal industry.”  He added:  “Above all, we have to urgently stop burning coal and other fossil fuels.”

In her book Inconspicuous Consumption: the environmental impact you don’t know you have, Tatiana Schlossberg points out that we still get about one-third of our electricity from coal.  So “streaming your online video may be coal-powered.”  Using as her source a 2014 EPA publication, she notes that coal ash remains one of the largest industrial solid-waste streams in the country, largely under-regulated, ending up polluting groundwater, streams, lakes, and rivers across the country.

“As crazy as this might sound,” Schlossberg writes, watching your favorite episode of “The Office” might come at the expense of clean water for someone else.  She’s concerned that even though we know we need electricity to power our computers, we don’t realize that going online itself uses electricity, which often comes from fossil fuels.

Illinois is finally dealing with at least one result of its longtime dependence on coal.  Environmental groups like Earthjustice celebrated a big win in Illinois in 2019 when they helped secure passage of milestone legislation strengthening rules for cleaning up the state’s coal-ash dumps.  In a special report, Earthjustice noted that coal ash, the toxic residue of burning coal, has been dumped nationwide into more than 1,000 unlined ponds and landfills, where it leaches into waterways and drinking water.

Illinois in particular has been severely impacted by coal ash.  It is belatedly overhauling its legacy of toxic coal waste and the resulting widespread pollution in groundwater near its 24 coal-ash dumpsites.  The new legislation funds coal-ash cleanup programs and requires polluters to set aside funds to ensure that they, not taxpayers, pay for closure and cleanup of coal-ash dumps.

Earthjustice rightfully trumpets its victory, which will now protect Illinois residents and its waters from future toxic pollution by coal ash.  But what about the legacy of the past, and what about the legacy of toxic coal particles that entered the air decades ago?

As an adult, I wonder about the huge quantities of coal dust I must have inhaled during every six-month-long Chicago winter that I lived through as a child.  I appear to have so far escaped adverse health consequences, but that could change at any time.

And I wonder about others in my generation.  How many of us have suffered or will suffer serious health problems as a result of drinking polluted water and inhaling toxic coal-dust particles?

I suspect that many in my generation have been unwilling victims of our decades-long dependence on coal.

Eating Dessert Can Help You Eat Better? Seriously?

I just celebrated my birthday with a scrumptious meal at a charming San Francisco restaurant. Sharing a fabulous candle-topped dessert with my companion was a slam-dunk way to end a perfect meal.

Should I regret consuming that delicious dessert?

The answer, happily, is no.  I should have no regrets about eating my birthday surprise, and a recent study backs me up.

According to this study, published in the Journal of Experimental Psychology: Applied and reported in a recent issue of TIME magazine, having an occasional dessert may actually be a useful tool to help you eat better.

Here’s what happened:  More than 130 university students and staff were offered a choice of two desserts and asked to make their choice at the start of the lunch line in a campus cafeteria.  The study found that those who made the “decadent” selection—lemon cheesecake—chose healthier meals and consumed fewer calories overall than those who picked fresh fruit.  Simply selecting it first was enough to influence the rest of their order.

Almost 70 percent of those who picked the cheesecake went on to choose a healthier main dish and side dish, while only about a third of those selecting fruit made the healthier choice.  The cheesecake-choosers also ate about 250 fewer total calories during their meal compared with the fruit-choosers.

Study co-author Martin Reimann, an assistant professor of marketing and cognitive science at the University of Arizona, concluded that choosing something healthy first can give us a “license” to choose something less healthy later.  But if you turn that notion around and choose something more “decadent” early on, “then this license [to choose high-calorie food] has already expired.”  In other words, making a calorie-laden choice at the beginning of the meal seems to steer people toward healthier choices later.

No one is suggesting that we all indulge in dessert on an everyday basis.  For many of us, the pursuit of good health leads us to avoid sugary desserts and choose fresh fruit instead.  But Reimann believes that choosing dessert strategically can pay off.  He advises us to be “mindful and conscious about the different choices you make.”

Will I order lemon cheesecake, a chocolate brownie, or a spectacular ice-cream concoction for dessert at my next meal?  Probably not.  But I am going to keep the Arizona research in mind.

You should, too.  Beginning your meal with the knowledge that it could end with a calorie-laden dessert just might prompt you to select a super-healthy salad for your entrée, adding crunchy green veggies on the side.

 

Do you ever find yourself saying things your parents said?

Maybe your father used some phrases you’ve caught yourself saying.  Because my father died when I was 12, I can’t recall any pet phrases he used, so I have none to repeat.

But my mother, who died when I was decades older–that’s a different story.

At the outset, you should know that Mom was very smart.  She yearned to go to college and become a teacher, but after her father died, her family didn’t have enough money to send her and both of her brothers to college. I’m sure you can guess the outcome.

Mom had many pet phrases.  More and more, I hear myself repeating them.  But not all of them.

Here are some of Mom’s best, along with the context that surrounds them:

 

One of Mom’s favorites was “Before you know it.”  She usually said it when we’d talk about something we expected to happen in the future.  For example, when we talked about a young child going off to college someday, she’d frequently say, “Before you know it….”  Or when, in the dead of winter, we talked about how far away summer seemed, she’d say, “Before you know it…”  Her instincts about how rapidly the future would arrive were usually right.  Now I often repeat that phrase myself.

When Mom conceded that something wasn’t just right, she’d often add, “Still and all.”  I can hear her saying it over and over again.  The dictionary defines the phrase as meaning “nevertheless” or “even so.”  Although you don’t hear many people use it, still and all it’s a great phrase.  Maybe more of us could use it.

When Mom liked to be very sure of something, she’d tell me that she wanted to “make doubly sure.”  I love that phrase and really must remember to use it whenever it fits.

 

Mom had definite views about gender and gender roles. They were typical of her era, so I give her a pass on some of them. But not all. These phrases frequently annoyed me, especially as I grew older and much more wary of gender stereotypes.

For example, I’ve written previously about how she admonished my sister and me to act “lady-like.”  I’m sure she thought that was the appropriate behavior for girl children.  But although the phrase didn’t bother me when I was younger, it later began to irritate me, especially when I had two daughters of my own, and the term “lady” assumed connotations I disagreed with.  But I don’t think Mom ever changed her thinking on that.

Her views on boys were distinctly different and bordered on stereotypical.

When a little kid acted up in her presence (and it was generally a boy), she’d refer to him as a “holy terror.”  She rarely referred to rambunctious girls that way.  But she might have.  (The prime example: My older sister, who later in life self-diagnosed as being a hyperactive child.  I know her behavior often created problems for my parents.)

Mom would frequently describe little boys she encountered as “all boy.”  I’m not really sure what she meant.  And as the mother of two daughters (as she was), her choice of words always struck me as rather strange.  Were girls ever “all girl”?  When?  Why?  And what made boys “all boy” to begin with?  I never challenged her on her use of this term and would just let it go.  But it still makes me wonder how she came up with it.

 

Let’s leave the gender issue for now and move on to the weather.

Living in Chicago, where we constantly faced extremes of heat and cold, most of us welcomed a warmer day that came along in late winter.  But Mom would often say, “It’s almost too warm.”  I guess she found the occasional warm day somewhat jarring in the middle of a cold spell.  But I was always delighted by that sort of change in the weather, and that phrase often made me laugh.

 

Now, on to the subject of time.

When we traveled, especially when we were driving somewhere in a car, Mom always relished “making good time.”  She meant that we were getting to our destination efficiently!  An admirable phrase, no?

But on other occasions she’d say, “Slow down.  We’ve got nothing but time.”  I generally disagreed with this point of view.  Always pursuing one goal or another, I’ve never felt I had “nothing but time.”  Quite the opposite.  And I’m afraid I still have the same outlook today.  But…maybe Mom was right, and I should slow down!

Slowing down might keep me from meeting some of my goals, but it would probably benefit my health.  I should keep in mind that one of my favorite Simon and Garfunkel songs begins this way:  “Slow down, you move too fast.  You got to make the morning last.”  Thanks, Paul Simon.  Mom definitely agreed with your thinking.

Speaking of “time,” Mom also liked to say that someone who wasn’t moving fast enough was “taking her sweet time.”  An example would be an employee in a retail store who helped customers in a poky fashion.  I sometimes think of that phrase when I see a pedestrian sauntering slowly across a busy intersection–sometimes looking at a cell phone instead of the traffic.  I’m often a pedestrian myself, and I resent careless drivers who barely let me cross an intersection safely before they make their turns.  (And I move fast.)  But when I’m driving, I find “saunterers” annoying.  They’re taking their sweet time!

I don’t think I ever encountered the “sweet time” phrase anywhere else…until I recently came across it in a short story, “Something to Remember Me By,” written by Nobel Prize-winning author Saul Bellow.  The narrator describes a character he’s watching this way:  “she simply took her sweet time about everything….”

That Mom and Saul Bellow used the same phrase doesn’t strike me as bizarre (as it might strike you) because the two of them were close in age, grew up in the same neighborhood on the northwest side of Chicago (Humboldt Park, to be precise), and attended the same public high school.  Mom sometimes told me that she knew the Bellow family.  So when Bellow published Humboldt’s Gift (which I confess I’ve never read), I figured he chose the name Humboldt because of his origins in that neighborhood.  Maybe everyone who grew up there during that era also used the “sweet time” phrase.

 

Mom found certain things disturbing.  She and my father always followed politics, perhaps inspiring my lifelong interest in the political scene.  But Mom could get “all worked up” when things didn’t strike her the right way.  A devotee of daily newspapers and local TV news, she continued to follow politics into her 90s, and she increasingly got “all worked up” when she listened to officeholders orating on TV, stating policies she disagreed with.

Although I never used this phrase in the past, it resonates with me more and more. If I don’t hit the mute button fast enough and inadvertently hear the current occupant of the White House or his cohorts speaking on TV, I can easily get all worked up.

 

Other things that disturbed Mom made her feel “sick at heart.”  I haven’t used that phrase, but maybe I should.  It reflects the reality that disturbing events can make us feel deeply troubled, even affecting our physical well-being.

 

Switching topics:  When I would go shopping with Mom, usually on State Street in downtown Chicago (she always called that part of town “the Loop”), Mom’s admonitions came fast and furious.  A favorite was “Watch your purse!”  So from the time I was old enough to carry my own handbag, I would clutch it close to me.  The irony is that I never was a victim, but one day a thief opened Mom’s handbag on a CTA bus, and her wallet disappeared.  I remember collecting the wallet for Mom at the Woolworth’s store on State Street when it somehow turned up, money extracted.

In a way, this outcome wasn’t terribly surprising.  Despite her fear of thievery, Mom would carry the kind of handbag that could easily be opened.  Held over her arm the way the Queen of England invariably holds hers, it had the kind of clasp that could be flipped open in a millisecond.  I’ve always preferred shoulder bags with zipper closures that I can hold next to my body, making them difficult to pilfer.  Now I frequently wear crossbody bags that discourage thievery even more.

Another downtown phrase:  In the enormous women’s restroom on the 3rd floor (or was it the 4th?) of Marshall Field’s vast State Street Store, Mom would always say “Flush with your foot!”  I guess the toilets were the kind that featured a flushing mechanism one could operate that way.  Mom’s concern with bacteria was always front and center.

 

This concern extended to household matters:  When I was older and my family and I had our own home, Mom would frequently visit us there.  She almost always made clear that she disapproved of my housekeeping (which admittedly has–throughout my lifetime–been abysmal).  Mom would offer to help, but as she got older, I wouldn’t let her do anything.  Accustomed to doing her own household chores with tremendous zeal, she would throw up her hands (figuratively), and after a while she’d tell me that she was “tired from sitting.”

Mom may have been onto something.  Research has since shown that prolonged sitting is in fact unhealthy.  Her instincts were right.

Mom also insisted that my daughters help me with household chores.  She would often tell them, “You can’t be lazy.”  This phrase relates to another literary reference:  In a story written by Nobel Prize-winning author Isaac Bashevis Singer (published in a collection of stories titled The Power of Light), Singer sets the scene in an old-world home. He quotes an elder who explains his view of miracles:  “The truth is that miracles were rare in all times.  If too many miracles occurred, people would rely on them too much.  Free choice would cease.  The Powers on High want [people] to do things, make an effort, not to be lazy.”

So it seems that Mom was borrowing the wisdom of the elders when she told us not to be lazy.

Today, my older daughter and I repeat Mom’s phrase to her two daughters, my delightful granddaughters.  Like Cinderella’s stepsisters, they would prefer to lie abed and have someone else do things like laundry and straightening up.  Let’s face it, I’m very much of the same mind.  I do as little as possible to make my home neat and tidy.

But Mom’s phrase often comes back to haunt me, and I remind myself, as well as my granddaughters, that you can’t be lazy!

 

So…when you find yourself repeating phrases your parents liked to use, remember that a great many of them have stood the test of time and can be repeated today with the same positive effect they had in their day.

Don’t be reluctant to use those phrases in your own conversation.  They may sometimes seem old-fashioned, no longer worth repeating because they’re out of date.

Still and all…they may say exactly what you want to say.

And before you know it, our kids will be doing the very same thing.

 

Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These nay-sayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke no more than a 3 out of 5 on a happiness rating, the result was very different.  The recipients turned out to be delighted:  many said they were “ecstatic,” scoring their happiness at 4 out of 5.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other benefits:  gratitude enhances empathy and reduces aggression (a 2012 study at the University of Kentucky), improves sleep (a 2011 study in Applied Psychology: Health and Well-Being), and boosts self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives that are filled with what might be called “blessings” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world have just as much to be grateful for as we do.

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

I didn’t totally escape sunburn, though.  One summer or two, Daddy must have been distracted, and I foolishly exposed my skin to the sun, earning a couple of bad burns.  He soothed them with a greasy ointment called butesin picrate.  But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid using sunscreens harboring suspect chemicals, I use only those sunscreens whose active ingredients include—guess what– zinc oxide.   Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?   Both have inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

Another Benefit of Progeny

Being a grandparent?  It’s wonderful.  And I just learned about a benefit of spending time with my grandkids that I never knew.  What’s more, you don’t even have to be a grandparent to share this benefit with me.

If you’re lucky and you already have a grandchild, congrats!  Grandparenthood is an extraordinarily good thing.  Free of the challenges of parenthood, you’ve plunged into a whole new shimmering world. And unless you’ve had to assume parent-like responsibility for your grandchild, you’ll relish the many rewards you’re now entitled to enjoy.

Spending a day with my grandchildren is my idea of a perfectly splendid day.

(By the way, I don’t call it “babysitting”!  I view babysitting as a paid job—a job I did to earn money in my younger days.  By contrast, spending time with my grandkids is a joyful pursuit I welcome doing.)

It’s not always easy to become a grandparent.  We all know you can’t make a grandchild appear with the wave of a magic wand.

First, you need to have a grown child or two.  Next, that child must want to have a child or two of his or her own.  (Let’s just say her.)

That child must be able to produce her own child.  Several routes now make that possible: the old-fashioned way; new ways to conceive and give birth, thanks to medical science; adoption; or becoming a step-parent.  (If you know of any other ways to produce a child, please let me know.)

Sometimes you can wait a long time.  A savvy parent doesn’t ask questions and doesn’t offer advice.  You need to be patient and let your child achieve parenthood whenever and however it works for her.

If, at last, your child has a child of her own, you are now officially a grandparent.

I’ve been lucky to have two exceptional daughters who both have children of their own.  And I delight in their company.

But even though I’ve always reveled in my role as a granny, empirical research has now uncovered a wonderful bonus:  It seems that spending time with your grandkids can significantly lower your risk of dying sooner rather than later.

A research study, published in May 2017 in Evolution & Human Behavior, concluded that caregiving both within and beyond the family is associated with lower mortality for the caregiver. This heartening conclusion seems to apply to every caregiver, grandparent or not.

The researchers from Switzerland, Germany, and Australia looked at data collected over two decades and focused specifically on grandparents. They concluded that “mortality hazards” for grandparents who provided childcare were 37% lower than for grandparents who did not.

Half of the caregiving grandparents lived for about 10 years after they were first interviewed for the highly respected Berlin Aging Study, while half of those grandparents who did not provide childcare died within 5 years. These results held true even when the researchers controlled for such factors as physical health, age, and socioeconomic status.

What about non-grandparents and childless older adults?  The positive effects of caregiving also extended to them–if they acted as caregivers in some way.  For example, older parents who had no grandchildren but provided practical help to their adult children also lived longer than those who didn’t.

The results of this study are even more significant than they might have been in the past.  According to the federal government’s latest scorecard on aging, there’s been a drop in overall life expectancy. If your goal is to stick around as long as possible, you might want to think about providing care to others, even if they aren’t your own kids or grandkids.

No kids?  No grandkids?  Here’s my suggestion: Enhance your longevity by becoming a grandparent-surrogate.  Even if you think you might have a child or grandchild of your own someday, why not offer to spend time with other people’s kids or grandkids right now?

If you do, you can expect to see little faces light up when you arrive on the scene.  Parents will be forever grateful, and you’ll probably have lots of fun.

How long will each of us live?  Who the heck knows!  But you might as well do what you can to prolong your life.  Spending time with children and grandchildren, your own or others’, is a jim-dandy way to do it.

 

A new book you may want to know about

There’s one thing we can all agree on:  Trying to stay healthy.

That’s why you may want to know about a new book, Killer Diseases, Modern-Day Epidemics:  Keys to Stopping Heart Disease, Diabetes, Cancer, and Obesity in Their Tracks, by Swarna Moldanado, PhD, MPH, and Alex Moldanado, MD.

In this extraordinary book, the authors have pulled together an invaluable compendium of both evidence and advice on how to stop the “killer diseases” they call “modern-day epidemics.”

First, using their accumulated wisdom and experience in public health, nursing science, and family medical practice, Swarna and Alex Moldanado offer the reader a wide array of scientific evidence.  Next, they present their well-thought-out conclusions on how this evidence supports their theories of how to combat the killer diseases that plague us today.

Their most compelling conclusion:  Lifestyle choices have an overwhelming impact on our health.  So although some individuals may suffer from diseases that are unavoidable, evidence points to the tremendous importance of lifestyle choices.

Specifically, the authors note that evidence “points to the fact that some of the most lethal cancers are attributable to lifestyle choices.”  Choosing to smoke tobacco or to consume alcohol in excess is an example of the sort of risky lifestyle choice that can lead to these cancers.

Similarly, cardiovascular diseases–diseases of the heart and blood vessels–share many common risk factors.  Clear evidence demonstrates that eating an unhealthy diet, a diet that includes too many saturated fats—fatty meats, baked goods, and certain dairy products—is a critical factor in the development of cardiovascular disease. The increasing size of food portions in our diet is another risk factor many people may not be aware of.

On the other hand, most of us are already aware of the dangers of physical inactivity.  But knowing about these dangers is not enough; many of us must change our lifestyle choices.  Those of us with sedentary careers, for example, must become far more physically active than our daily routines encourage.

Yes, the basics of this information appear frequently in the media.  But the Moldanados reveal a great deal of scientific evidence you might not know about.

Even more importantly, in Chapter 8, “Making and Keeping the Right Lifestyle Choices,” the authors step up to the plate in a big way.  Here they clearly and forcefully state their specific recommendations for succeeding in the fight against killer diseases.

Following these recommendations could lead all of us to a healthier and brighter outcome.

Kudos to the authors for collecting an enormous volume of evidence, clearly presenting it to us, and concluding with their invaluable recommendations.

No more excuses!  Let’s resolve to follow their advice and move in the right direction to help ensure our good health.

 

Pockets!

Women’s clothes should all have pockets. 

(A bit later in this post, I’ll explain why.)

I admit it.  I’m a pocket-freak.

When I shop for pants, I don’t bother buying new ones, no matter how appealing, if they don’t have pockets.  Why?

Because when I formerly bought pants that didn’t have pockets, I discovered over time that I never wore them. They languished forever in a shameful pile of unworn clothes.

It became clear that I liked the benefits of wearing pants with pockets.  Why then would I buy new pants without pockets when those I already had were languishing unworn?

Result:  I simply don’t buy no-pocket pants anymore.

Most jeans have pockets, often multiple pockets, and I like wearing them for that reason, among others.  (Please see “They’re My Blue Jeans, and I’ll Wear Them If I Want To,” published in this blog in May 2017.)

Most jackets, but not all, have pockets.  Why not?  They all need pockets.  How useful is a jacket if it doesn’t have even one pocket to stash your stuff?

Dresses and skirts should also have pockets.  Maybe an occasional event, like a fancy gala, seems to require a form-fitting dress that doesn’t have pockets.  But how many women actually go to galas like that?  Looking back over my lifetime of clothes-wearing, I can think of very few occasions when I had to wear a no-pocket dress.  As for skirts, I lump them in the same category as pants.  Unless you feel compelled for some bizarre reason to wear a skin-tight pencil skirt, what good is a skirt without pockets?

Cardigan sweaters, like jackets, should also have pockets.  So should robes.  Pajamas. Even nightgowns.  I wear nightgowns, and I relish being able to stick something like a facial tissue into the pocket of my nightgown!   You never know when you’re going to sneeze, right?

Did you ever watch a TV program called “Project Runway?”  It features largely unknown fashion designers competing for approval from judges, primarily high-profile insiders in the fashion industry.  Here’s what I’ve noticed when I’ve watched an occasional episode:  Whenever a competing designer puts pockets in her or his designs, the judges enthusiastically applaud that design.  They clearly recognize the value of pockets and the desire by women to wear clothes that include them.

(By the way, fake pockets are an abomination.  Why do designers think it’s a good idea to put a fake pocket on their designs?  Sewing what looks like a pocket but isn’t a real pocket adds insult to injury.  Either put a real pocket there, or forget the whole thing.  Fake pockets?  Boo!)

Despite the longing for pockets by women like me, it can be challenging to find women’s clothes with pockets.  Why?

Several women writers have speculated about this challenge, generally railing against sexist attitudes that have led to no-pocket clothing for women.

Those who’ve traced the evolution of pockets throughout history discovered that neither men nor women wore clothing with pockets until the 17th century.  Pockets in menswear began appearing in the late 1600s.  But women?  To carry anything, they had to tie a sack to a string worn around their waists and tuck the sack under their petticoats.

These sacks eventually evolved into small purses called reticules that women would carry in their hands.  But reticules were so small that they limited what women could carry.  As the twentieth century loomed, women rebelled.  According to London’s Victoria and Albert Museum, dress patterns started to include instructions for sewing pockets into skirts.  And when women began wearing pants, they would finally have pockets.

But things soon switched back to no-pocket pants.  The fashion industry wasn’t a big fan of pockets, insisting on featuring “slimming” designs for women, while men’s clothes still had scads of pockets.  The result has been the rise of bigger and bigger handbags (interestingly, handbags are often called “pocketbooks” on the East Coast).

Enormous handbags create a tremendous burden for women.  Their size and weight can literally weigh a woman down, impeding her ability to move through her busy life the way men can.  (I’ve eschewed bulky handbags, often wearing a backpack instead.  Unfortunately, backpacks are not always appropriate in a particular setting.)

Today, many women are demanding pockets.  Some have advocated pockets with the specific goal of enabling women to carry their iPhones or other cell phones that way.  I’m a pocket-freak, but some scientific research suggests that the radiation cell phones emit may pose a risk to your health.  Some experts in the field have therefore advised against keeping a cell phone adjacent to your body.  In December 2017, the California Department of Public Health specifically warned against keeping a cell phone in your pocket.  So, in my view, advocating pockets for that reason is not a good idea.

We need pockets in our clothes for a much more important and fundamental reason:  Freedom.

Pockets give women the kind of freedom men have:  The freedom to carry possessions close to their bodies, allowing them to reach for essentials like keys without fumbling through a clumsy handbag.

I propose a boycott on no-pocket clothes.  If enough women boycott no-pocket pants, for example, designers and manufacturers will have to pay attention.  Their new clothing lines will undoubtedly include more pockets.

I hereby pledge not to purchase any clothes without pockets.

Will you join me?