Category Archives: human biology

Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People had long been aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young).  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear: a new disease with an unknown cause.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  In the meantime, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catchphrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Playing on the name of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between the ages of six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction eerily echoed the question John Kerry posed when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, which was administered orally and was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to defeat deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Another Benefit of Progeny

Being a grandparent?  It’s wonderful.  And I just learned about a benefit of spending time with my grandkids that I never knew.  What’s more, you don’t even have to be a grandparent to share this benefit with me.

If you’re lucky and you already have a grandchild, congrats!  Grandparenthood is an extraordinarily good thing.  Free of the challenges of parenthood, you’ve plunged into a whole new shimmering world. And unless you’ve had to assume parent-like responsibility for your grandchild, you’ll relish the many rewards you’re now entitled to enjoy.

Spending a day with my grandchildren is my idea of a perfectly splendid day.

(By the way, I don’t call it “babysitting”!  I view babysitting as a paid job—a job I did to earn money in my younger days.  By contrast, spending time with my grandkids is a joyful pursuit I welcome doing.)

It’s not always easy to become a grandparent.  We all know you can’t make a grandchild appear with the wave of a magic wand.

First, you need to have a grown child or two.  Next, that child must want to have a child or two of his or her own.  (Let’s just say her.)

That child must be able to produce her own child.  Several routes now make that possible: the old-fashioned way; new ways to conceive and give birth, thanks to medical science; adoption; or becoming a step-parent.  (If you know of any other ways to produce a child, please let me know.)

Sometimes you can wait a long time.  A savvy parent doesn’t ask questions and doesn’t offer advice.  You need to be patient and let your child achieve parenthood whenever and however it works for her.

If, at last, your child has a child of her own, you are now officially a grandparent.

I’ve been lucky to have two exceptional daughters who both have children of their own.  And I delight in their company.

But even though I’ve always reveled in my role as a granny, empirical research has now uncovered a wonderful bonus:  It seems that spending time with your grandkids can significantly lower your risk of dying sooner rather than later.

A research study, published in May 2017 in Evolution & Human Behavior, concluded that caregiving both within and beyond the family is associated with lower mortality for the caregiver. This heartening conclusion seems to apply to every caregiver, grandparent or not.

The researchers from Switzerland, Germany, and Australia looked at data collected over two decades and focused specifically on grandparents. They concluded that “mortality hazards” for grandparents who provided childcare were 37% lower than for grandparents who did not.

Half of the caregiving grandparents lived for about 10 years after they were first interviewed for the highly respected Berlin Aging Study, while half of those grandparents who did not provide childcare died within 5 years. These results held true even when the researchers controlled for such factors as physical health, age, and socioeconomic status.

What about non-grandparents and childless older adults?  The positive effects of caregiving also extended to them, as long as they acted as caregivers in some way.  For example, older parents who had no grandchildren but provided practical help to their adult children also lived longer than those who didn’t.

The results of this study are even more significant than they might have been in the past.  According to the federal government’s latest scorecard on aging, there’s been a drop in overall life expectancy. If your goal is to stick around as long as possible, you might want to think about providing care to others, even if they aren’t your own kids or grandkids.

No kids?  No grandkids?  Here’s my suggestion: Enhance your longevity by becoming a grandparent-surrogate.  Even if you think you might have a child or grandchild of your own someday, why not offer to spend time with other people’s kids or grandkids right now?

If you do, you can expect to see little faces light up when you arrive on the scene.  Parents will be forever grateful, and you’ll probably have lots of fun.

How long will each of us live?  Who the heck knows!  But you might as well do what you can to prolong your life.  Spending time with children and grandchildren, your own or others’, is a jim-dandy way to do it.

 

A new book you may want to know about

There’s one thing we can all agree on:  Trying to stay healthy.

That’s why you may want to know about a new book, Killer diseases, modern-day epidemics:  Keys to stopping heart disease, diabetes, cancer, and obesity in their tracks, by Swarna Moldanado, PhD, MPH, and Alex Moldanado, MD.

In this extraordinary book, the authors have pulled together an invaluable compendium of both evidence and advice on how to stop the “killer diseases” they call “modern-day epidemics.”

First, using their accumulated wisdom and experience in public health, nursing science, and family medical practice, Swarna and Alex Moldanado offer the reader a wide array of scientific evidence.  Next, they present their well-thought-out conclusions on how this evidence supports their approach to combating the killer diseases that plague us today.

Their most compelling conclusion:  Lifestyle choices have an overwhelming impact on our health.  So although some individuals may suffer from diseases that are unavoidable, evidence points to the tremendous importance of lifestyle choices.

Specifically, the authors note that evidence “points to the fact that some of the most lethal cancers are attributable to lifestyle choices.”  Smoking tobacco and consuming alcohol in excess are examples of the sort of risky lifestyle choices that can lead to these cancers.

Similarly, cardiovascular diseases (diseases of the heart and blood vessels) share many common risk factors.  Clear evidence demonstrates that eating an unhealthy diet, one that includes too many saturated fats—fatty meats, baked goods, and certain dairy products—is a critical factor in the development of cardiovascular disease. The increasing size of food portions in our diet is another risk factor many people may not be aware of.

On the other hand, most of us are aware of the dangers of physical inactivity.  But knowledge of these dangers is not enough.  Many of us must change our lifestyle choices.  Those of us in sedentary careers, for example, must find ways to become much more physically active than our daily routines require.

Yes, the basics of this information appear frequently in the media.  But the Moldanados reveal a great deal of scientific evidence you might not know about.

Even more importantly, in Chapter 8, “Making and Keeping the Right Lifestyle Choices,” the authors step up to the plate in a big way.  Here they clearly and forcefully state their specific recommendations for succeeding in the fight against killer diseases.

Following these recommendations could lead all of us to a healthier and brighter outcome.

Kudos to the authors for collecting an enormous volume of evidence, clearly presenting it to us, and concluding with their invaluable recommendations.

No more excuses!  Let’s resolve to follow their advice and move in the right direction to help ensure our good health.


Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good brown fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area in the brain that responds to unpleasant emotional experiences—show that the brain becomes desensitized with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  As they put it, “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.  When these leaders set out on a course of lying, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and exposes them to the rest of us whenever it can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]

 

On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.

 

Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This example seems ripe for joke-telling, and sure enough, someone made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate a joke-teller as highly confident.  But inappropriate or insensitive jokes don’t do a joke-teller any favors; they can have the opposite effect.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and start searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs. But in 1997, the Food and Drug Administration revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries (New Zealand is the other) that permit this kind of advertising.

The Food and Drug Administration is responsible for regulating it and is supposed to take into account ethical and other concerns to prevent the undue influence of DTC advertising on consumer demand.  The fear was that advertising would lead to a demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at reining in skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  Calling for a three-year moratorium on advertising new prescription drugs directly to consumers, the bill would freeze these ads, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  That these ads are pretty much about making more and more money for the drug manufacturers.  And that most of the people who can afford them are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  And those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]

 

Feeling Lazy? Blame Evolution

I’m kind of lazy.  I admit it. I like to walk, ride a bike, and splash around in a pool, but I don’t indulge in a lot of exercise beyond that.

Now a Harvard professor named Daniel Lieberman says I can blame human evolution.  In a recent paper, “Is Exercise Really Medicine? An Evolutionary Perspective,” he explains his ideas.

First, he says (and this is the sentence I really like), “It is natural and normal to be physically lazy.”  Why?  Because human evolution has led us to exercise only as much as we must to survive.

We all know that our ancestors lived as hunter-gatherers and that food was often scarce.  Lieberman adds this idea:  Resting was key to conserving energy for survival and reproduction.  “In other words, humans were born to run—but as little as possible.”

As he points out, “No hunter-gatherer goes out for a jog, just for the sake of it….”  Thus, we evolved “to require stimuli from physical activity.”  For example, muscles become bigger and more powerful with use, and they atrophy when they’re not used.  In the human circulatory system, “vigorous activity stimulates expansion of …circulation,” improves the heart’s ability to pump blood, and increases the elasticity of arteries.  But with less exercise, arteries stiffen, the heart pumps less blood, and metabolism slows.

Lieberman emphasizes that this entire process evolved to conserve energy whenever possible.  Muscles use a lot of calories, making them costly to maintain.  Muscle wasting thus evolved as a way to lower energy consumption when physical activity wasn’t required.

What about now?  For most of human history, it was simply not possible to lead an existence devoid of physical activity.  The result:  According to Lieberman, the mechanisms humans have always used to reduce energy expenditures in the absence of physical activity now manifest as diseases.

So maladies like heart disease, diabetes, and osteoporosis are now the consequences of adaptations that evolved to trim energy demand, and modern medicine is now stuck with treating the symptoms.

In the past, hunter-gatherers had to exercise because if they didn’t, they had nothing to eat.  Securing food was an enormous incentive.  But today, for most humans there are very few incentives to exercise.

How do we change that?  Although there’s “no silver bullet,” Lieberman thinks we can try to make activity “more fun for more people.”  Maybe making exercise more “social” would help.  Community sports like soccer teams and fun-runs might encourage more people to get active.

Lieberman has another suggestion.  At his own university, students are no longer required to take physical education as part of the curriculum.  Harvard voted its physical-education requirement out of existence in the 1970s, and he thinks it’s time to reinstate it.  He notes surveys that show that very few students who are not athletes on a team get sufficient exercise.  A quarter of Harvard undergraduates have reported being sedentary.

Because “study after study shows that…people who get more physical activity have better concentration, their memories are better, they focus better,” Lieberman argues that the time spent exercising is “returned in spades…not only in the short term, but also in the long term.  Shouldn’t we care about the long-term mental and physical health of our students?”

Lieberman makes a powerful argument for reinstating phys-ed in those colleges and universities that have dropped it.  His argument also makes sense for those of us no longer in school.

Let’s foil what the millennia of evolution have done to our bodies and boost our own level of exercise as much as we can.

Tennis, anyone?

 

[Daniel Lieberman’s paper was the focus of an article in the September-October 2016 issue of Harvard Magazine.  He’s the Lerner professor of biological sciences at Harvard.]