
Lipstick, Then and Now

Let’s talk about lipstick.

Lipstick?

I know what you’re thinking.  Lipstick is not the weightiest topic I could be writing about.  But it’s a pretty good reflection of how our lives have changed since March.

A few years ago, I wrote about something I called “The Lip-Kick Effect.”  At the time, we were working our way out of a financial recession, and many Americans still felt stuck in neutral or worse.  I wondered:  How do we cope?  By buying more…lipstick?

The improbable answer was “Yes.”  Researchers had concluded that the more insecure the economy, the more women tended to spend on beauty products, especially lipstick.  They dubbed this phenomenon the “lipstick effect.”

(I preferred to call it the “lip-kick effect.”  When one of my daughters was quite small, she pronounced “lipstick” as “lip-kick,” and her mispronunciation struck me as an even better moniker for the “lipstick effect.”)

Five separate studies confirmed this hypothesis.  They found that during recessions over the previous 20 years, women had reallocated their spending, deciding to spend their money on beauty products instead of other items.

Why did women confronted with economic hardship seek out new beauty products?  The researchers came up with a host of reasons.  Most significant: a desire to attract men, especially men with money.

Another reason?  Wearing lipstick could boost a woman’s morale.

In that blissful time BC (before Covid-19), I cheerfully admitted that I was a (credit-)card-carrying member of the latter group.  Like many women, I got a kick out of wearing lipstick.  I added that “while uncertainty reigns, we women get our kicks where we can.”

Believing that a brand-new lipstick could be a mood-changer, I bought into the notion that lipstick could make women feel better.  And lipstick was a pretty cheap thrill.  For just a few dollars, I could head to my local drugstore and choose from scores of glittering options.

That was then.  This is now.  A very different now.

In 2020, lipstick has become expendable.  If you’re still staying at home, sheltering in place, or whatever you choose to call it, most makeup has become expendable.

By April, I had pretty much given up wearing lipstick.  When I wrote about wearing scarves as face-coverings, I added:  “One more thing I must remember before I wrap myself in one of my scarves:  Forget about lipstick.  Absolutely no one is going to see my lips, and any lip color would probably rub off on my scarf.”  [https://susanjustwrites.wordpress.com/2020/04/06/join-the-ranks-of-the-scarf-wearers/]

The same goes, of course, for masks.

A former believer in the lip-kick effect, I now gaze at my collection of colorful lipsticks and immediately dismiss the idea of applying one to my lips.  I’m not alone.  When many of us decided to adopt masks and other face-coverings, sales of lip products fell.  As a market research analyst noted, “Nobody wants lipstick smudges inside their masks” (quoted in The Washington Post on June 15th).  Today, as cases of coronavirus spike in many parts of the country, there’s increasing urgency about wearing masks, and even legal requirements to do so.

I wear a mask or scarf whenever I leave home.  Now, viewing my wide array of all sorts of makeup, I primarily focus on sunscreen and other products that protect my skin when I take my daily stroll.

Instead of lipstick, I’ll apply a lip balm like Burt’s Bees moisturizing lip balm.  For the tiniest bit of color, I might add “lip shimmer.”  But neither of these has the look or feel of a true lipstick.  The kind I used to view as a morale-booster.

For a boost in morale, I now rely on sunshine and the endorphins produced by my brisk walking style.

Wearing lipstick right now?  Forgeddaboutit….

Now let’s think about lipstick in a new light.  When a vaccine is proven to be safe and effective, and a vanishing pandemic no longer dictates the wearing of face-coverings like masks, will we women return to adding color to our lips?  Will we enthusiastically rush to retail establishments that offer an array of enticing new lipsticks?

The answer, for now, is unclear.  Many women, adopting the almost universally accepted cultural norm that lipstick will make them more attractive to others, may happily put their dollars down to buy those bright tubes of color again.  Some women may continue to view wearing lipstick as a morale-booster.  But others, after some contemplation, may decide that buying lipstick and other types of makeup isn’t where we should direct our hard-earned cash.

Maybe at least some of our dollars are more usefully directed elsewhere:  To help our neediest fellow citizens; to bolster causes that promote long-sought equity; to support efforts to combat climate change and the pollution of our planet; to assist medical research that will cure diseases of every stripe.

The future of lipstick?  Who the heck knows?

Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People had been aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young)—for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear:  it was a new disease with an unknown cause.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  In the meantime, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catch phrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Using a name like that of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like the question John Kerry posed when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an orally administered version that was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to defeat deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally provide only one way to dry our hands:  either paper towels or air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive study of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers use more energy than it takes to produce a paper towel, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show that the opposite is true.  These studies found that blowers tend to spread bacteria three to six feet from the dryer.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can ourselves take action by asking those who manage the restrooms we frequent to adopt the use of paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.

 

Coal: A Personal History

It’s January, and much of the country is confronting freezing temperatures, snow, and ice.  I live in San Francisco now, but I vividly remember what life is like in cold-weather climates.

When I was growing up on the North Side of Chicago, my winter garb followed this pattern:

Skirt and blouse, socks (usually short enough to leave my legs largely bare), a woolen coat, and a silk scarf for my head.  Under my coat, I might have added a cardigan sweater.  But during the freezing cold days of winter (nearly every day during a normal Chicago winter), I was always COLD—when I was outside, that is.

My parents were caring and loving, but they followed the norms of most middle-class parents in Chicago during that era.  No one questioned this attire.  I recall shivering whenever our family ventured outside for a special event during the winter.  I especially remember the excitement of going downtown to see the first showing of Disney’s “Cinderella.”  Daddy parked our Chevy at an outdoor parking lot blocks from the theater on State Street, and we bravely faced the winter winds as we made our way there on foot.  I remember being COLD.

School days were somewhat different.  On bitter cold days, we girls were allowed to cover our legs, but only if we hung our Levi’s in our lockers when we arrived at school.  We may have added mufflers around our heads and necks to create just a little more warmth as we walked blocks and blocks to school in the morning, back home for lunch, then back to school for the afternoon.

Looking back, I can’t help wondering why it never occurred to our parents to clothe us more warmly.  Weren’t they aware of the warmer winter clothing worn elsewhere?  One reason we didn’t adopt warmer winter garb–like thermal underwear, or down jackets, or ski parkas–may have been a lack of awareness that such things existed.  Or the answer may have been even simpler:  the abundance of coal.

Inside, we were never cold.  Why?  Because heating with coal was ubiquitous.  It heated our apartment buildings, our houses, our schools, our stores, our movie theaters, our libraries, our public buildings, and almost everywhere else.  Radiators heated by coal hissed all winter long.  The result?  Overheated air.

Despite the bleak winter outside, inside I was never cold.  On the contrary, I was probably much too warm in the overheated spaces we inhabited.

Until I was 12, we lived in an apartment with lots of windows.  In winter the radiators were always blazing hot, so hot that we never felt the cold air outside.  The window glass would be covered in condensed moisture, a product of the intensely heated air, and I remember drawing funny faces on the glass that annoyed my scrupulous-housekeeper mother.

Where did all that heat come from?  I never questioned its ultimate source.

I later learned that the coal was extracted from deep beneath the earth.  But what happened to it above ground was no secret.  More than once, I watched trucks pull up outside my apartment building to deliver large quantities of coal.  The driver would set up a chute that sent the coal directly into the basement, where all those lumps of coal must have been shoveled into a big furnace.

Coal was the primary source of heat back then, and the environment suffered as a result.  After the coal was burned in the furnace, its ashes would be shoveled into bags.  Many of the ashes found their way into the environment.  They were, for example, used on pavements and streets to cope with snow and ice.

The residue from burning coal also led to other harmful results.  Every chimney spewed thick sooty smoke all winter, sending into the air the toxic particles that we all inhaled.

Coal was plentiful, cheap, and reliable.  And few people were able to choose alternatives like fireplaces and wood-burning furnaces (which presented their own problems).

Eventually, cleaner and more easily distributed forms of heating fuel displaced coal.  Residential use dropped, and according to one source, today it amounts to less than one percent of heating fuel.

But coal still plays a big part in our lives.  As Malcolm Turnbull, the former prime minister of Australia (which is currently suffering the consequences of climate change), wrote earlier this month in TIME magazine, the issue of “climate action” has been “hijacked by a toxic, climate-denying alliance of right-wing politics and media…, as well as vested business interests, especially in the coal industry.”  He added:  “Above all, we have to urgently stop burning coal and other fossil fuels.”

In her book Inconspicuous Consumption: The Environmental Impact You Don’t Know You Have, Tatiana Schlossberg points out that we still get about one-third of our electricity from coal.  So “streaming your online video may be coal-powered.”  Using as her source a 2014 EPA publication, she notes that coal ash remains one of the largest industrial solid-waste streams in the country, largely under-regulated, ending up polluting groundwater, streams, lakes, and rivers across the country.

“As crazy as this might sound,” Schlossberg writes, watching your favorite episode of “The Office” might come at the expense of clean water for someone else.  She’s concerned that even though we know we need electricity to power our computers, we don’t realize that going online itself uses electricity, which often comes from fossil fuels.

Illinois is finally dealing with at least one result of its longtime dependence on coal.   Environmental groups like Earthjustice celebrated a big win in Illinois in 2019 when they helped win passage of milestone legislation strengthening rules for cleaning up the state’s coal-ash dumps.  In a special report, Earthjustice noted that coal ash, the toxic residue of burning coal, has been dumped nationwide into more than 1,000 unlined ponds and landfills, where it leaches into waterways and drinking water.

Illinois in particular has been severely impacted by coal ash.  It is belatedly overhauling its legacy of toxic coal waste and the resulting widespread pollution in groundwater near its 24 coal-ash dumpsites.  The new legislation funds coal-ash cleanup programs and requires polluters to set aside funds to ensure that they, not taxpayers, pay for closure and cleanup of coal-ash dumps.

Earthjustice rightfully trumpets its victory, which will now protect Illinois residents and its waters from future toxic pollution by coal ash.  But what about the legacy of the past, and what about the legacy of toxic coal particles that entered the air decades ago?

As an adult, I wonder about the huge quantities of coal dust I must have inhaled during every six-month-long Chicago winter that I lived through as a child.  I appear to have so far escaped adverse health consequences, but that could change at any time.

And I wonder about others in my generation.  How many of us have suffered or will suffer serious health problems as a result of drinking polluted water and inhaling toxic coal-dust particles?

I suspect that many in my generation have been unwilling victims of our decades-long dependence on coal.

Who the Heck Knows?

I have a new catch phrase:  “Who the heck knows?”

I started using it last fall, and ever since then I’ve found that it applies to almost everything that might arise in the future.

I don’t claim originality, but here’s how I came up with it:

At a class reunion in October, I was asked to be part of a panel of law school classmates who had veered off the usual lawyer-track and now worked in a totally different area.

Specifically, I was asked to address a simple question:  Why did I leave my work as a lawyer/law professor and decide to focus primarily on writing?

First, I explained that I’d always loved writing, continued to write even while I worked as a lawyer, and left my law-related jobs when they no longer seemed meaningful.  I added that my move to San Francisco led to launching my blog and publishing my first two novels.

I concluded:

“If I stay healthy and my brain keeps functioning, I want to continue to write, with an increasing focus on memoirs….  I’ll keep putting a lot of this kind of stuff on my blog.  And maybe it will turn into a book or books someday.

“Who the heck knows?”

 

After I said all that, I realized that my final sentence was the perfect way to respond to almost any question about the future.

Here’s why it seems to me to apply to almost everything:

None of us knows what the next day will bring.  Still, we think about it.

In “Men Explain Things to Me,” the author Rebecca Solnit notes “that we don’t know what will happen next, and the unlikely and the unimaginable transpire quite regularly.”  She finds uncertainty hopeful, while viewing despair as “a form of certainty,” certainty that “the future will be a lot like the present or will decline from it.”

Let’s cast certainty aside and agree, with Solnit, that uncertainty is hopeful.  Let’s go on to question what might happen in the uncertain future.

For example:

We wonder whether the midterm elections will change anything.

We wonder whether our kids will choose to follow our career choices or do something totally different.

We wonder whether our family history of a deadly disease will lead to having it ourselves.

We wonder whether to plan a trip to Peru.

We wonder whether we’re saving enough money for retirement.

We wonder how the U.S. Supreme Court will rule in an upcoming case.

We wonder what our hair will look like ten years from now.

We wonder what the weather will be like next week.

And we wonder what the current occupant of the White House will say or do regarding just about anything.

 

You may have an answer in mind, one that’s based on reason or knowledge or probability.   But if you’re uncertain…in almost every case, the best response is:  Who the heck knows?

If you’re stating this response to others, I suggest using “heck” instead of a word that might offend anyone.  It also lends a less serious tone to all of the unknowns out there, some of which are undoubtedly scary.

If you prefer to use a more serious tone, you can of course phrase things differently.

But I think I’ll stick with “Who the heck knows?”

Warning:  If you spend any time with me, you’ll probably hear me say it, again and again.

But then, who the heck knows?

A new book you may want to know about

There’s one thing we can all agree on:  Trying to stay healthy.

That’s why you may want to know about a new book, Killer diseases, modern-day epidemics:  Keys to stopping heart disease, diabetes, cancer, and obesity in their tracks, by Swarna Moldanado, PhD, MPH, and Alex Moldanado, MD.

In this extraordinary book, the authors have pulled together an invaluable compendium of both evidence and advice on how to stop the “killer diseases” they call “modern-day epidemics.”

First, using their accumulated wisdom and experience in public health, nursing science, and family medical practice, Swarna and Alex Moldanado offer the reader a wide array of scientific evidence.  Next, drawing on this evidence, they present their well-thought-out conclusions about how to combat the killer diseases that plague us today.

Their most compelling conclusion:  Lifestyle choices have an overwhelming impact on our health.  So although some individuals may suffer from diseases that are unavoidable, evidence points to the tremendous importance of lifestyle choices.

Specifically, the authors note that evidence “points to the fact that some of the most lethal cancers are attributable to lifestyle choices.”  Choosing to smoke tobacco or to consume alcohol in excess are examples of the sort of risky lifestyle choices that can lead to these killer diseases.

Similarly, cardiovascular diseases–diseases of the heart and blood vessels–share many common risk factors.  Clear evidence demonstrates that eating an unhealthy diet, a diet that includes too many saturated fats—fatty meats, baked goods, and certain dairy products—is a critical factor in the development of cardiovascular disease. The increasing size of food portions in our diet is another risk factor many people may not be aware of.

On the other hand, most of us are aware of the dangers of physical inactivity.  But knowledge of these dangers is not enough.  Many of us must change our lifestyle choices.  Those of us in sedentary careers, for example, must become far more physically active than our daily routines naturally allow.

Yes, the basics of this information appear frequently in the media.  But the Moldanados reveal a great deal of scientific evidence you might not know about.

Even more importantly, in Chapter 8, “Making and Keeping the Right Lifestyle Choices,” the authors step up to the plate in a big way.  Here they clearly and forcefully state their specific recommendations for succeeding in the fight against killer diseases.

Following these recommendations could lead all of us to a healthier and brighter outcome.

Kudos to the authors for collecting an enormous volume of evidence, clearly presenting it to us, and concluding with their invaluable recommendations.

No more excuses!  Let’s resolve to follow their advice and move in the right direction to help ensure our good health.

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs. But in 1997, the Food and Drug Administration revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries–New Zealand is the other one–that permit this kind of advertising.

The Food and Drug Administration is responsible for regulating it and is supposed to take into account ethical and other concerns to prevent the undue influence of DTC advertising on consumer demand.  The fear was that advertising would lead to a demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at reining in skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  Calling for a three-year moratorium on advertising new prescription drugs directly to consumers, the bill would freeze these ads, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  That these ads are pretty much about making more and more money for the drug manufacturers.  And that most of the people who can afford them are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  And those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]

 

Feeling Lazy? Blame Evolution

I’m kind of lazy.  I admit it. I like to walk, ride a bike, and splash around in a pool, but I don’t indulge in a lot of exercise beyond that.

Now a Harvard professor named Daniel Lieberman says I can blame human evolution.  In a recent paper, “Is Exercise Really Medicine? An Evolutionary Perspective,” he explains his ideas.

First, he says (and this is the sentence I really like), “It is natural and normal to be physically lazy.”  Why?  Because human evolution has led us to exercise only as much as we must to survive.

We all know that our ancestors lived as hunter-gatherers and that food was often scarce.  Lieberman adds this idea:  Resting was key to conserving energy for survival and reproduction.  “In other words, humans were born to run—but as little as possible.”

As he points out, “No hunter-gatherer goes out for a jog, just for the sake of it….”  Thus, we evolved “to require stimuli from physical activity.”  For example, muscles become bigger and more powerful with use, and they atrophy when they’re not used.  In the human circulatory system, “vigorous activity stimulates expansion of …circulation,” improves the heart’s ability to pump blood, and increases the elasticity of arteries.  But with less exercise, arteries stiffen, the heart pumps less blood, and metabolism slows.

Lieberman emphasizes that this entire process evolved to conserve energy whenever possible.  Muscles use a lot of calories, making them costly to maintain.  Muscle wasting thus evolved as a way to lower energy consumption when physical activity wasn’t required.

What about now?  Until recently, it was never possible in human history to lead an existence devoid of activity.  The result:  According to Lieberman, the mechanisms humans have always used to reduce energy expenditures in the absence of physical activity now manifest as diseases.

So maladies like heart disease, diabetes, and osteoporosis are now the consequences of adaptations that evolved to trim energy demand, and modern medicine is now stuck with treating the symptoms.

In the past, hunter-gatherers had to exercise because if they didn’t, they had nothing to eat.  Securing food was an enormous incentive.  But today, for most humans there are very few incentives to exercise.

How do we change that?  Although there’s “no silver bullet,” Lieberman thinks we can try to make activity “more fun for more people.”  Maybe making exercise more “social” would help.  Community sports like soccer teams and fun-runs might encourage more people to get active.

Lieberman has another suggestion.  At his own university, students are no longer required to take physical education as part of the curriculum.  Harvard voted its physical-education requirement out of existence in the 1970s, and he thinks it’s time to reinstate it.  He notes surveys that show that very few students who are not athletes on a team get sufficient exercise.  A quarter of Harvard undergraduates have reported being sedentary.

Because “study after study shows that…people who get more physical activity have better concentration, their memories are better, they focus better,” Lieberman argues that the time spent exercising is “returned in spades…not only in the short term, but also in the long term.  Shouldn’t we care about the long-term mental and physical health of our students?”

Lieberman makes a powerful argument for reinstating phys-ed in those colleges and universities that have dropped it.  His argument also makes sense for those of us no longer in school.

Let’s foil what the millennia of evolution have done to our bodies and boost our own level of exercise as much as we can.

Tennis, anyone?

 

[Daniel Lieberman’s paper was the focus of an article in the September-October 2016 issue of Harvard Magazine.  He’s the Lerner professor of biological sciences at Harvard.]

 

Put some spice into your (longer) life

Do you like spicy food? I do! So I was happy to learn about the mounting evidence that eating spicy food is linked to a longer life.

The New York Times, CNN, and Time magazine recently reported on a Chinese study of nearly half a million people (487,375, to be exact). The mass of data collected in that study showed an association between eating spicy food and a reduced risk of death.

The study, reported in the medical journal BMJ, included Chinese men and women enrolled between 2004 and 2008 and followed for an average of more than seven years. Using self-reported questionnaires, the researchers analyzed the spicy food consumption of people aged 30 to 70 across 10 regions in China, excluding those with cancer, heart disease, and stroke. The researchers controlled for family medical history, age, education, diabetes, smoking, and a host of other variables.

They found that those eating spicy food, mainly food containing chili peppers, once or twice a week had a 10 percent reduced overall risk for death, compared with those eating spicy food less than once a week. Further, they found that consuming spicy food six to seven times a week reduced the risk even more–14 percent.

Spicy food eaters had lower rates of ischemic heart disease, respiratory diseases, and cancers. (Ischemic heart disease, a common cause of death, arises from a reduced blood supply to the heart, usually caused by atherosclerosis.)

Although the researchers drew no conclusions about cause and effect, they pointed out that capsaicin, the main ingredient in chili peppers, had been found in other studies to have antioxidant and anti-inflammatory effects.

“There is accumulating evidence from mostly experimental research to show the benefit of spices or their active components on human health,” said Lu Qi, an associate professor of nutrition at Harvard’s T. H. Chan School of Public Health and a co-author of the study. But, he added, “we need more evidence, especially from clinical trials, to further verify these findings, and we are looking forward to seeing data from other populations.”

What’s different about spicy foods? The study highlights the benefits of capsaicin, a bioactive ingredient in chili peppers, which has previously been linked to health perks like increased fat-burning.

But most experts emphasize the need for more research. One such expert is Daphne Miller, associate clinical professor at the University of California, San Francisco, and author of “The Jungle Effect: The Healthiest Diets from Around the World, Why They Work and How to Make Them Work for You.”

Miller told CNN that many variables associated with eating spicy food haven’t been addressed in the study. The study itself notes that it lacks information about other dietary and lifestyle habits and how the spicy food was cooked or prepared. “It’s an observational study within a single culture,” she said.

In addition, the researchers note that although chili pepper was the most commonly used spice, the use of other spices tends to increase as the use of chili pepper increases. Consuming these other spices may also result in health benefits.

But Miller said the findings are still plausible, given the fact that spicy foods also have high levels of phenolic compounds, which are chemicals with nutritional and anti-inflammatory value.

Bio-psychologist John E. Hayes agrees.  Hayes, an associate professor of food science and director of the Sensory Evaluation Center at Penn State University, has previously studied the association between spicy food and personality.  According to CNN, he notes that chili intake has an overall protective effect.  But why?  “Is it a biological mechanism or a behavioral mechanism?”

Eating spicy food might work biologically to increase the basal metabolic rate, says Hayes.  But it might also slow food intake, causing a person to eat fewer calories.

Although Lu Qi believes the protective effect associated with spicy foods would translate across cultures, Hayes isn’t sure. When we talk about spicy food, “we can mean vastly different things, with different health implications,” Hayes says. “That spicy food could be…vegetables, like kimchee. Or it could be…barbecued spare ribs.”

“This isn’t an excuse to go out and eat 24 wings and then rationalize it by claiming they are going to make you live longer,” Hayes adds.

Let’s not forget that eating spicy foods also has some risks. Spicy food can create problems for people with incontinence or overactive bladders, according to Kristen Burns, an adult urology nurse-practitioner at Johns Hopkins Hospital in Baltimore. And some believe that spicy foods can aggravate colds or sinus infections.

Another risk is “heartburn.”  Does spicy food trigger heartburn in some people?  Yes, but not always.  According to Lauren Gerson, a gastroenterologist at the California Pacific Medical Center in San Francisco, a lot of her patients with heartburn (more precisely, acid reflux disease, or GERD) were told by other doctors to stop eating everything on a list of 10 trigger foods.  The list included favorite foods like chocolate and spicy food.

Gerson told Nutrition Action that these patients were “miserable because their heartburn wasn’t much better” even when they gave up all of those foods.  Gerson and her then-colleagues at Stanford University screened more than 2,000 studies, looking for evidence that avoiding trigger foods helps curb acid reflux symptoms.  They found that there wasn’t “any data out there that if you stop these foods…, GERD would get any better.”

So when the American College of Gastroenterology updated its treatment guidelines for GERD in 2013, it concluded that there wasn’t enough evidence for doctors to advise cutting out a whole list of foods. Instead, patients are advised to avoid certain foods only if that lessens their symptoms. The key seems to be “individualized trigger avoidance,” allowing many heartburn sufferers to enjoy spicy food, so long as it doesn’t make their symptoms worse.

The bottom line? If you like the taste of spicy food, and it doesn’t trigger any adverse effects (like heartburn or weight-gain from too many calories), you should enthusiastically munch on the spicy foods you love. According to the latest research, you just might prolong your life.

Bon appetit!