
Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good, calorie-burning beige fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight-gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

The Last Straw(s)

A crusade against plastic drinking straws?  Huh?

At first glance, it may strike you as frivolous.  But it’s not.  In fact, it’s pretty darned serious.

In California, the city of Berkeley may kick off such a crusade.   In June, the city council directed its staff to research what would be California’s first city ordinance prohibiting the use of plastic drinking straws in bars, restaurants, and coffee shops.

Berkeley is responding to efforts by nonprofit groups like the Surfrider Foundation that want to eliminate a significant source of pollution in our oceans, lakes, and other bodies of water. According to the conservation group Save the Bay, the annual cleanup days held on California beaches have found that plastic straws and stirrers are the sixth most common kind of litter.  If they’re on our beaches, they’re flowing into the San Francisco Bay, into the Pacific Ocean, and ultimately into oceans all over the world.

As City Councilwoman Sophie Hahn, a co-author of the proposal to study the ban, has noted, “They are not biodegradable, and there are alternatives.”

I’ve been told that plastic straws aren’t recyclable, either.  So whenever I find myself using a plastic straw to slurp my drink, I conscientiously separate my waste:  my can of Coke Zero goes into the recycling bin; my plastic straw goes into the landfill bin.  This is nuts.  Banning plastic straws in favor of paper ones is the answer.

Realistically, it may be a tough fight to ban plastic straws because business interests (like the Monster Straw Co. in Laguna Beach) want to keep making and selling them.  Business owners also claim that plastic straws are more cost-effective and that customers prefer them.  As Monster’s founder and owner, Natalie Buketov, told the SF Chronicle, “right now the public wants cheap plastic straws.”

Berkeley could vote on a ban by early 2018.

On the restaurant front, some chefs would like to see the end of plastic straws.  Spearheading a growing movement to steer eateries away from serving straws is Marcel Vigneron, owner-chef of Wolf Restaurant on Melrose Avenue in L.A.  Vigneron, who’s appeared on TV’s “Top Chef” and “Iron Chef,” is also an enthusiastic surfer, and he’s seen the impact of straw-pollution on the beaches and marine wildlife.  He likes the moniker “Straws Suck” to promote his effort to move away from straws, especially the play on words:  “You actually use straws to suck, and they suck because they pollute the oceans,” he told CBS in July.

Vigneron added that if a customer wants a straw, his restaurant has them.  But servers ask customers whether they want a straw instead of automatically putting them into customers’ drinks.  He notes that every day, 500 million straws are used in the U.S., and they could “fill up 127 school buses.”  He wants to change all that.

Drinking straws have a long history.  Their origins were apparently actual straw, or other straw-like grasses and plants.  The first paper straw, made from paper coated with paraffin wax, was patented in 1888 by Marvin Stone, who didn’t like the flavor of a rye grass straw added to his mint julep.  The “bendy” paper straw was patented in 1937.  But the plastic straw took off, along with many other plastic innovations, in the 1960s, and nowadays they’re difficult to avoid.

Campaigns like Surfrider’s have taken off because of mounting concern with plastic pollution.  Surfrider, which has also campaigned against other threats to our oceans, like plastic bags and cigarette butts, supports the “Straws Suck” effort, and according to author David Suzuki, Bacardi has joined with Surfrider in the movement to ban plastic straws.

Our neighbors to the north have already leaped ahead of California.  The town of Tofino in British Columbia claims that it mounted the very first “Straws Suck” campaign in 2016.  By Earth Day in April that year, almost every local business had banned plastic straws.  A fascinating story describing this effort appeared in the Vancouver Sun on April 22, 2016.

All of us in the U.S., indeed the world, need to pay attention to what plastic is doing to our environment.  “At the current rate, we are really headed toward a plastic planet,” according to the author of a study published in the journal Science Advances and reported by AP in July.  Roland Geyer, an industrial ecologist at UC Santa Barbara, noted that there’s enough discarded plastic to bury Manhattan under more than 2 miles of trash.

Geyer used the plastics industry’s own data to show that the amount of plastic made and thrown out is accelerating.  In 2015, the world produced more than twice as much plastic as it did in 1998.

The plastics industry has fought back, relying on the standard of cost-effectiveness.  It claims that alternatives to plastic, like glass, paper, or aluminum, would require more energy to produce.  But even if that’s true, the energy difference in the case of items like drinking straws would probably be minimal.  If we substitute paper straws for plastic ones, the cost difference would likely be negligible, while the difference for our environment (eliminating all those plastic straws floating around in our waterways) could be significant.

Aside from city bans and eco-conscious restaurateurs, we need to challenge entities like Starbucks.  The mega-coffee-company and coffeehouse-chain prominently offers, even flaunts, brightly-colored plastic straws for customers sipping its cold drinks.  What’s worse:  they happily sell them to others!  Just check out the Starbucks straws for sale on Amazon.com.  Knowing what we know about plastic pollution, I think Starbucks’s choice to further pollute our environment by selling its plastic straws on the Internet is unforgivable.

At the end of the day, isn’t this really the last straw?

 

Declare Your Independence: Those High Heels Are Killers

I’ve long maintained that high heels are killers.  I never used that term literally, of course.  I merely viewed high-heeled shoes as distinctly uncomfortable and an outrageous concession to the dictates of fashion that can lead to both pain and permanent damage to a woman’s body.

A few years ago, however, high heels proved to be actual killers.  The Associated Press reported that two women, ages 18 and 23, were killed in Riverside, California, as they struggled in high heels to get away from a train.  With their car stuck on the tracks, the women attempted to flee as the train approached.  A police spokesman later said, “It appears they were in high heels and [had] a hard time getting away quickly.”

Like those young women, I was sucked into wearing high heels when I was a teenager.  It was de rigueur for girls at my high school to seek out the trendy shoe stores on State Street in downtown Chicago and purchase whichever high-heeled offerings our wallets could afford.  On my first visit, I was entranced by the three-inch-heeled numbers that pushed my toes into a too-narrow space and revealed them in what I thought was a highly provocative position.  If feet can have cleavage, those shoes gave me cleavage.

Never mind that my feet were encased in a vise-like grip.  Never mind that I walked unsteadily on the stilts beneath my soles.  And never mind that my whole body was pitched forward in an ungainly manner as I propelled myself around the store.  I liked the way my legs looked in those shoes, and I had just enough baby-sitting money to pay for them.  Now I could stride with pride to the next Sweet Sixteen luncheon on my calendar, wearing footwear like all the other girls’.

That luncheon revealed what an unwise purchase I’d made.  When the event was over, I found myself stranded in a distant location with no ride home, and I started walking to the nearest bus stop.  After a few steps, it was clear that my shoes were killers.  I could barely put one foot in front of the other, and the pain became so great that I removed my shoes and walked in stocking feet the rest of the way.

After that painful lesson, I abandoned three-inch high-heeled shoes and resorted to wearing lower ones.   Sure, I couldn’t flaunt my shapely legs quite as effectively, but I managed to secure male attention nevertheless.

Instead of conforming to the modern-day equivalent of Chinese foot-binding, I successfully and happily fended off the back pain, foot pain, bunions, and corns that my fashion-victim sisters suffer in spades.

The recent trend toward higher and higher heels is disturbing.  I’m baffled by women, especially young women, who buy into the mindset that they must follow the dictates of fashion and the need to look “sexy” by wearing extremely high heels.

When I watch TV, I see too many women wearing stilettos that force them into the ungainly walk I briefly sported so long ago.  I can’t help noticing the women on late-night TV shows who are otherwise smartly attired and often very smart (in the other sense of the word), yet wear ridiculously high heels that force them to greet their hosts with that same ungainly walk.  Some appear on the verge of toppling over.  And at a recent Oscar awards telecast, women tottered to the stage in ultra-high heels, often accompanied by escorts who kindly held onto them to prevent their embarrassing descent into the orchestra pit.

The women who, like me, have adopted lower-heeled shoes strike me as much smarter and much less likely to fall on their attractive (and sometimes surgically-enhanced) faces.

Here’s another example.  When I sat on the stage of Zellerbach Hall at the Berkeley commencement for math students a few years ago, I was astonished that many if not most of the women graduates hobbled across the stage to receive their diplomas in three- and four-inch-high sandals.  I was terrified that these super-smart math students would trip and fall before they could grasp the document their mighty brain-power had earned.  (Fortunately, none of them tripped, but I could nevertheless imagine the foot-pain that accompanied the joy of receiving their degrees.)

Foot-care professionals soundly support my view.  According to the American Podiatric Medical Association, a heel that’s more than 2 or 3 inches high makes comfort just about impossible.  Why?  Because a 3-inch heel creates seven times more stress than a 1-inch heel.

The San Francisco Chronicle recently questioned Dr. Amol Saxena, a podiatrist and foot and ankle surgeon who practices in Palo Alto (and assists Nike’s running team).  He explained that after 1.5 inches, the pressure increases on the ball of the foot and can lead to “ball-of-the-foot numbness.”  (Yikes!)  He doesn’t endorse 3-inch heels and points out that celebrities wear them for only a short time (for example, on the red carpet), not all day.  To ensure a truly comfortable shoe, he adds, don’t go above a 1.5-inch heel.  If you insist on wearing higher heels, limit how much time you spend in them.

Some encouraging changes may be afoot.  The latest catalog from Nordstrom, one of America’s major shoe-sellers, features a large number of lower-heeled styles along with higher-heeled numbers.  Because Nordstrom is a bellwether in the fashion world, its choices can influence shoe-seekers.  Or is Nordstrom reflecting what its shoppers have already told the store’s decision-makers?  The almighty power of the purse (how shoppers are choosing to spend their money) probably plays a big role here.

Beyond the issue of comfort, let’s remember that high heels present a far more urgent problem.  As the deaths in Riverside demonstrate, women who wear high heels can be putting their lives at risk.  When women need to flee a dangerous situation, it’s pretty obvious that high heels can handicap their ability to escape.

How many other needless deaths have resulted from hobbled feet?

The Fourth of July is fast approaching.  As we celebrate the holiday this year, I urge the women of America to declare their independence from high-heeled shoes.

If you’re currently wearing painful footwear, bravely throw those shoes away, or at the very least, toss them into the back of your closet.   Shod yourself instead in shoes that allow you to walk—and if need be, run—in comfort.

Your wretched appendages, yearning to be free, will be forever grateful.

 

[Earlier versions of this commentary appeared on Susan Just Writes and the San Francisco Chronicle.]

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when a more cheerful demeanor would probably make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting a far more worrisome result of rude behavior:  serious errors made by doctors and nurses.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies, reported on April 12 in The Wall Street Journal, concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams heard a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department. The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for The Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient-emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  In a 2015 British study, “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And other studies found that rudeness toward medical students by attending physicians, residents, and nurses appeared to be a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Munching on Meatloaf

Meatloaf, that old standby, has just acquired a new cachet.  Or has it?

A recent column by Frank Bruni in The New York Times focused on food snobs, in particular their ridicule of Donald Trump’s love of meatloaf.  Weeks earlier, Trump had “forced Chris Christie to follow his lead at a White House lunch and eat meatloaf, which the president praised as his favorite item on the menu.”

According to Bruni, a former restaurant critic, news coverage of the lunch “hinted that Trump wasn’t merely a bully but also a rube.  What grown-up could possibly be so fond of this retro, frumpy dish?”

Bruni’s answer:  “Um, me.  I serve meatloaf at dinner parties.  I devoted a whole cookbook to it.”

Allow me to join forces with Frank Bruni.  Putting aside my general negativity towards all things Trump, I have to admit I’m fond of meatloaf, too.

My recollections of eating meatloaf go back to the dining-room table in our West Rogers Park apartment in the 1950s.  My mother was never an enthusiastic cook.  She prepared meals for us with a minimal degree of joy, no doubt wishing she could spend her time on other pursuits.  It was simply expected of her, as the wife and mother in our mid-century American family, to come up with some sort of breakfast, lunch, and dinner nearly every day.

Breakfasts rarely featured much more than packaged cereal and milk.  I remember putting a dusting of sugar on corn flakes—something I haven’t done since childhood.  Did we add fresh fruit to our cereal?  Not very often.  We might have added raisins.   But fresh fruit, like the abundant blueberries and strawberries we can now purchase all year long, wasn’t available in Chicago grocery stores during our long cold ‘50s winters.  At least not in our income bracket.

Daddy occasionally made breakfast on the weekends.  I remember watching him standing in front of our ‘30s-style mint green enamel-covered stove, whipping up his specialty, onions and eggs, with aplomb.  But those highly-anticipated breakfasts were rare.

[I recently discovered that stoves like that one are still available.  They’re advertised online by a “retro décor lover’s dream resource” in Burbank, as well as on eBay, where an updated model is currently listed for $4,495.]

As for lunch, my public grade school compelled us to walk home for lunch every day.  Only a handful of sub-zero days broke that mold.  Our school had no cafeteria, or even a lunchroom, where kids could eat in frigid weather.  Only on alarmingly cold days were we permitted to bring a lunch from home and eat it in the school auditorium.  If we pleaded convincingly enough, our parents might let us buy greasy hamburgers at Miller’s School Store.

Most days I’d walk home, trudging the six long blocks from school to home and back within an hour. Mom would have lunch waiting for me on our breakfast-room table, mostly sandwiches and the occasional soup.  Mom rarely made her own soup.  She generally opened cans of Campbell’s “vegetable vegetarian,” eschewing canned soups that included any possibility of unknown meat.

Mom’s dinner specialties included iceberg-lettuce salads, cooked veggies and/or potatoes, and a protein of some kind.  Because of her upbringing, she invariably chose fish, poultry, or cuts of meats like ground beef, beef brisket, and lamb chops.

Which brings us to meatloaf.

I must have liked Mom’s meatloaf because I don’t have a single negative memory associated with it.  And when I got married and began preparing meals for my own family, I never hesitated to make meatloaf myself.

Fortunately, I didn’t have to prepare dinner every night.  I was immensely lucky to marry a man who actually enjoyed cooking.  Although I inherited my mother’s reluctance to spend much time in the kitchen, Herb relished preparing elaborate gourmet dishes à la Julia Child—in fact, he often used her cookbook—and proudly presenting them to our daughters and me whenever his schedule allowed.

But when I was the cook, meatloaf was one of my favorite choices.  I’d buy lean ground beef, add breadcrumbs, ketchup, and assorted herbs and spices, mix it all together with my bare hands, and heat the finished product until it was just right.  Aware by then of warnings about high-fat red meat, I’d carefully remove my loaf pan from the oven and scrupulously drain as much fat from the pan as I could.  The result?  A tasty and relatively low-fat dish.  My family loved it.

At some point I discovered the glories of leftover meatloaf.  Chilled in the fridge overnight, it made a toothsome sandwich the next day.  It was especially good on rye bread and loaded with ketchup.  Wrapped in a plastic baggie, it would go from home to wherever I traveled to work, and I had to use my most stalwart powers of self-discipline to wait till lunchtime to bite into its deliciousness.

Those days are sadly over.  I rarely prepare dinner for my family anymore, and my consumption of meat products has gone way down.  Most days, when I reflect on what I’ve eaten, I realize that, more often than not, I’ve unknowingly eaten a wholly vegetarian diet.

I haven’t eaten meatloaf in years.  But hearing about Trump’s penchant for it has awakened my tastebuds.  If I could just get my hands on a tasty low-fat version like the one I used to make, my long meatloaf-drought might finally be over.

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs. But in 1997, the Food and Drug Administration (FDA) revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries–New Zealand is the other one–that permit this kind of advertising.

The FDA is responsible for regulating DTC advertising and is supposed to take ethical and other concerns into account to prevent its undue influence on consumer demand.  The fear was that advertising would lead to a demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at reining in skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  It calls for a three-year moratorium on advertising new prescription drugs directly to consumers, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  these ads are pretty much about making more and more money for the drug manufacturers, and most of the people who can afford the drugs are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  And those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]

 

Feeling Lazy? Blame Evolution

I’m kind of lazy.  I admit it. I like to walk, ride a bike, and splash around in a pool, but I don’t indulge in a lot of exercise beyond that.

Now a Harvard professor named Daniel Lieberman says I can blame human evolution.  In a recent paper, “Is Exercise Really Medicine? An Evolutionary Perspective,” he explains his ideas.

First, he says (and this is the sentence I really like), “It is natural and normal to be physically lazy.”  Why?  Because human evolution has led us to exercise only as much as we must to survive.

We all know that our ancestors lived as hunter-gatherers and that food was often scarce.  Lieberman adds this idea:  Resting was key to conserving energy for survival and reproduction.  “In other words, humans were born to run—but as little as possible.”

As he points out, “No hunter-gatherer goes out for a jog, just for the sake of it….”  Thus, we evolved “to require stimuli from physical activity.”  For example, muscles become bigger and more powerful with use, and they atrophy when they’re not used.  In the human circulatory system, “vigorous activity stimulates expansion of …circulation,” improves the heart’s ability to pump blood, and increases the elasticity of arteries.  But with less exercise, arteries stiffen, the heart pumps less blood, and metabolism slows.

Lieberman emphasizes that this entire process evolved to conserve energy whenever possible.  Muscles use a lot of calories, making them costly to maintain.  Muscle wasting thus evolved as a way to lower energy consumption when physical activity wasn’t required.

What about now?  Until recently, it was never possible in human history to lead an existence devoid of activity.  The result:  According to Lieberman, the mechanisms humans have always used to reduce energy expenditures in the absence of physical activity now manifest as diseases.

So maladies like heart disease, diabetes, and osteoporosis are now the consequences of adaptations that evolved to trim energy demand, and modern medicine is now stuck with treating the symptoms.

In the past, hunter-gatherers had to exercise because if they didn’t, they had nothing to eat.  Securing food was an enormous incentive.  But today, for most humans there are very few incentives to exercise.

How do we change that?  Although there’s “no silver bullet,” Lieberman thinks we can try to make activity “more fun for more people.”  Maybe making exercise more “social” would help.  Community sports like soccer teams and fun-runs might encourage more people to get active.

Lieberman has another suggestion.  At his own university, students are no longer required to take physical education as part of the curriculum.  Harvard voted its physical-education requirement out of existence in the 1970s, and he thinks it’s time to reinstate it.  He notes surveys that show that very few students who are not athletes on a team get sufficient exercise.  A quarter of Harvard undergraduates have reported being sedentary.

Because “study after study shows that…people who get more physical activity have better concentration, their memories are better, they focus better,” Lieberman argues that the time spent exercising is “returned in spades…not only in the short term, but also in the long term.  Shouldn’t we care about the long-term mental and physical health of our students?”

Lieberman makes a powerful argument for reinstating phys-ed in those colleges and universities that have dropped it.  His argument also makes sense for those of us no longer in school.

Let’s foil what the millennia of evolution have done to our bodies and boost our own level of exercise as much as we can.

Tennis, anyone?

 

[Daniel Lieberman’s paper was the focus of an article in the September-October 2016 issue of Harvard Magazine.  He’s the Lerner professor of biological sciences at Harvard.]

 

The Pink Lady

When I was growing up, my mother’s cocktail of choice was a “pink lady.” Whenever our family went out for dinner (and those dinners-out didn’t happen often), she’d order a frothy and very rosy-hued “pink lady” while Daddy chose an “old-fashioned.”

My parents weren’t everyday drinkers. Au contraire. My mother would sometimes speak disparagingly of those who indulged overmuch in alcoholic beverages, referring to them as “shikkers.” Although Daddy may have had an occasional drink at home after a difficult day at work (probably bourbon or another kind of whiskey), Mom never did. She reserved her pursuit of alcohol for our occasional dinners-out.

One dinner spot we favored was the Fireside Restaurant in Lincolnwood, Illinois, not far from our apartment on the Far North Side of Chicago. (Ironically, the restaurant was itself destroyed by fire–reputedly by mob-related arson–a few years later.) Another place we patronized was Phil Smidt’s (which everyone pronounced like “Schmidt’s”), located just over the Indiana border.

Why did we travel to Indiana for dinner when good food was undoubtedly available to us much closer to home? And long before an interstate highway connected Chicago to Northern Indiana? I remember a prolonged and very slow trip on surface streets and maybe a small highway or two whenever we headed to Phil Smidt’s.

Perhaps we wound up there because the restaurant was a perennial favorite among the people my parents knew. Or perhaps because my father actually enjoyed driving. Yes, Daddy liked getting behind the wheel in those long-ago days before everyone had a car and the roads weren’t jam-packed with other drivers. Daddy got a kick out of driving us in every direction from our home on Sunday afternoons, when traffic was especially light. But I also remember his frustration with drivers who didn’t seem to know where they were going. He referred to them as “farmers,” implying that they were wide-eyed rural types unaccustomed to city driving.

Perhaps we headed to Indiana because my parents were overly enthusiastic about the fare offered at Phil Smidt’s. As I recall, the place was famous for fried perch and fried chicken. I usually opted for the fried chicken. (At the Fireside Restaurant, my first choice was French-fried shrimp. Dinners-out seemed to involve a lot of fried food back then, and oh, my poor arteries.)

If we were celebrating a special event, like my mother’s birthday or Mother’s Day, Mom would wear a corsage. I’ve never been especially fond of corsages, which were de rigueur during my high school prom-going days. Boys would bring their dates a corsage, and girls were expected to ooh and aah over them. But I always thought corsages were a highly artificial way to display fresh flowers, and I rejected them whenever I had a choice. I’m glad social norms have evolved to diminish the wearing of corsages like those women and girls formerly felt compelled to wear.

Mom, however, always seemed pleased to wear the corsage Daddy gave her. Her favorite flower was the gardenia, and its strong scent undoubtedly wafted its way toward her elegantly shaped nose whenever he pinned one on her dress.

The “pink lady” cocktail, which incorporates gin as its basic ingredient, first appeared early in the 20th century. Some speculate that its name was inspired by a 1911 Broadway musical, “The Pink Lady,” whose star was known by the same name.  It may have become popular during Prohibition, when the gin available was so dreadful that people added flavors like grenadine to obscure its bad taste.

The cocktail evolved into a number of different varieties over the years. Mom’s frothy version, around since the 1920s, adds sweet cream to the usual recipe of gin, grenadine (which provides flavoring and the pink color), and egg white.

Apparently (and not surprisingly), the drink eventually acquired a “feminine” image, both because of its name and because its sweet and creamy content wasn’t viewed as “masculine” enough in the eyes of male critics. One bartender also speculated that the non-threatening appearance of the “pink lady” probably was a major reason why it appealed to women who had limited experience with alcohol.

No doubt Mom was one of those women.

The very name of the cocktail, the “pink lady,” fit Mom to a T. She was absolutely determined to be a “lady” in every way and to instill “lady-like” behavior in her two daughters. I was frequently admonished to repress my most rambunctious ways by being told I wasn’t being lady-like. And when I had two daughters of my own, decades later, despite my strong opposition she still repeated the same admonition. She found it hard to shift gears and approve of her granddaughters’ behaving in what she viewed as a non-lady-like way. Although her basic sweetness, like that of her favorite drink, predominated in our relationship, we did differ on issues like that one.

The appellation of “pink lady” fit Mom in another way as well. She was a redhead whose fair skin would easily flush, lending a pink hue to her appearance. Whenever she was agitated (sometimes because my sister or I provoked her)…or whenever she excitedly took pride in one of our accomplishments…and assuredly whenever she was out in the sun too long, she literally turned pink.

So here’s to you, Pink Lady. In my memory, you’ll always resemble the very pink and very sweet cocktail you preferred.

P.S. re Sugar

Sugar has been the focus of two of my previous posts, the October 2015 post on chewing sugar-free gum to avoid tooth decay (“Chew on This”) and a more general indictment of sugar in October 2014 (“Gimme a Little Sugar”).

I now have a P.S. to add to those.

According to the Union of Concerned Scientists (UCS), the FDA has endorsed a proposed revision to the Nutrition Facts label that appears on about 700,000 packaged food items. The new label will give consumers more information about the sugar hidden in their food.

Here’s the proposed change: the label will specify the amount of “added sugars” in a product. In other words, it will highlight the sugar that doesn’t naturally occur in the product’s other ingredients. It will also include the percentage of an adult’s recommended daily intake of sugar that this added sugar represents. Significantly, it will caution consumers to “AVOID TOO MUCH” of these added sugars.

The UCS calls this “a win for science” because it validates the strong scientific evidence that consuming too much sugar contributes to diseases affecting millions of Americans. It’s a major win because scientists were up against both the sugar lobby and the powerful packaged-food industry’s lobbyists, all of whom fought against the proposed change.

It’s also a win for public health because “Americans remain remarkably uninformed about the health dangers of excessive sugar intake” and even about how much sugar they’re already consuming. The average American consumes more than 19 teaspoons of sugar every day! And an estimated 74 percent of all packaged foods—including many presumably non-sweet products like soups, salad dressings, and crackers—contain added sugar.

The UCS will continue to fight for the proposed change in hopes that the new label is finalized soon.

This info appears in the Fall 2015 issue of Catalyst, a UCS publication.

Chew on This

During the holiday season–spanning Halloween, Thanksgiving, and the December holidays–most of us worry about our consumption of sugary candy and desserts.

We should worry. Sugar not only adds calories but can also lead to other health problems. For one thing, sugar clearly leads to problems with our teeth. It’s well established that the bacteria in our mouths combine with sugar to create acids that cause tooth decay.

There’s a useful remedy for the tooth problem. No, not the one that immediately comes to mind.

Sure, you can brush your teeth right after consuming sugar-loaded food and drink. But how many of us do it?

Until something else comes along (and it inevitably will, thanks to researchers like the ones I noted in my blog post “Beavers? Seriously?” last March), here’s one thing you can try: chewing sugar-free gum.

In October, The Wall Street Journal highlighted how chewing gum can help reduce tooth decay. It quoted a spokeswoman for the American Dental Association–a family dentist in Fremont, California, Dr. Ruchi Sahota–on the virtues of sugar-free gum. According to Dr. Sahota, chewing gum after eating stimulates saliva, and that can prevent cavities.

Why? Naturally occurring saliva helps to neutralize the mouth by reducing the acids produced by bacteria feeding on sugars in food, and those acids are what ultimately cause cavities. Chewing sugar-free gum can reduce the amount of that cavity-causing acid. In 2007, the ADA began including chewing gum in its Seal of Acceptance program. But only sugarless gums can qualify (other gums contain the kinds of sugars used as food by bacteria).

Sugar-free gums typically use artificial sweeteners, most of which are created in a lab, and there’s been some discussion of whether they are safe. But concerns about their being carcinogenic have been dismissed by the FDA for lack of clear evidence.

Some dentists promote chewing gum sweetened with xylitol, a sugar alcohol that usually derives from wood fiber. Studies have shown that it adds minerals to tooth enamel, and one study showed that it can inhibit the growth of bacteria that stick to teeth.

But a recent analysis concluded that there was insufficient evidence that xylitol can help prevent cavities. So Dr. Sahota told the Journal that the research “isn’t conclusive enough” to promote gums with xylitol over other sugar-free gums.

Although some dentists recommend chewing sugarless gum for at least 20 minutes to get the full anti-bacterial effect, Dr. Sahota disagrees. She advises moderation, cautioning people “not to overchew,” which can be hard on the jaw and tooth enamel.

Regarding candy, Dr. S. recommends avoiding sticky or hard candies because they’re the worst cavity-causing villains. Chocolate is much better for your teeth because it washes away more easily than other candies. Yay, chocolate!

As an inveterate gum-chewer, I’m happy to learn that all those sticks of sugar-free gum I chew can help me avoid tooth decay.

But “candy is candy.” So although chewing gum may help forestall the worst effects of coating our teeth with sugar, we need to remember that a toothbrush will do an even better job of scouring all that sugar off our teeth.

Enjoy those sugary holiday treats. But don’t forget to keep some sugar-free gum handy to pop in your mouth when you’re done. Even rinsing your mouth with water ought to help. And at bedtime, if not before, head for your trusty Sonicare or Oral-B.

Once your teeth are properly scoured, you can drift off to sleep, those visions of sugar plums dancing in your head.