Category Archives: medical research

Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People had been aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young)—for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear.  It was a new disease with an unknown cause, and that uncertainty fed the fear.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  Until a vaccine arrives, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, which soon became known as the March of Dimes.  The catchphrase “march of dimes” was coined by the popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Playing on the name of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like John Kerry’s demand when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an orally administered version that was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to defeat deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally give us just one option for drying our hands:  either paper towels or air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive study of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers use more energy than it takes to produce a paper towel, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show the opposite:  blowers tend to spread bacteria 3 to 6 feet.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, and then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can take action ourselves by asking those who manage the restrooms we frequent to adopt paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

One summer or two, though, I did get a couple of bad sunburns. Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burns. But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid using sunscreens harboring suspect chemicals, I use only those sunscreens whose active ingredients include—guess what—zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both have inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

A new book you may want to know about

There’s one thing we can all agree on:  Trying to stay healthy.

That’s why you may want to know about a new book, Killer diseases, modern-day epidemics:  Keys to stopping heart disease, diabetes, cancer, and obesity in their tracks, by Swarna Moldanado, PhD, MPH, and Alex Moldanado, MD.

In this extraordinary book, the authors have pulled together an invaluable compendium of both evidence and advice on how to stop the “killer diseases” they call “modern-day epidemics.”

First, using their accumulated wisdom and experience in public health, nursing science, and family medical practice, Swarna and Alex Moldanado offer the reader a wide array of scientific evidence.  Next, they present their well-thought-out conclusions about how this evidence supports their strategies for combating the killer diseases that plague us today.

Their most compelling conclusion:  Lifestyle choices have an overwhelming impact on our health.  So although some individuals may suffer from diseases that are unavoidable, evidence points to the tremendous importance of lifestyle choices.

Specifically, the authors note that evidence “points to the fact that some of the most lethal cancers are attributable to lifestyle choices.”  Smoking tobacco and drinking alcohol to excess are examples of the sort of risky lifestyle choices that can lead to these lethal cancers.

Similarly, cardiovascular diseases–diseases of the heart and blood vessels–share many common risk factors.  Clear evidence demonstrates that eating an unhealthy diet, a diet that includes too many saturated fats—fatty meats, baked goods, and certain dairy products—is a critical factor in the development of cardiovascular disease. The increasing size of food portions in our diet is another risk factor many people may not be aware of.

On the other hand, most of us are aware of the dangers of physical inactivity.  But knowledge of these dangers is not enough.  Many of us must change our lifestyle choices.  Those of us in sedentary careers, for example, must become far more physically active than our daily routines encourage us to be.

Yes, the basics of this information appear frequently in the media.  But the Moldanados reveal a great deal of scientific evidence you might not know about.

Even more importantly, in Chapter 8, “Making and Keeping the Right Lifestyle Choices,” the authors step up to the plate in a big way.  Here they clearly and forcefully state their specific recommendations for succeeding in the fight against killer diseases.

Following these recommendations could lead all of us to a healthier and brighter outcome.

Kudos to the authors for collecting an enormous volume of evidence, clearly presenting it to us, and concluding with their invaluable recommendations.

No more excuses!  Let’s resolve to follow their advice and move in the right direction to help ensure our good health.

 

Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good brown fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting far more worrisome results from rude behavior:  serious errors made by doctors and nurses as a result of rudeness.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies was the subject of an April 12 report in The Wall Street Journal, which concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some of the teams heard a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department. The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for The Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient-emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  A 2015 British study found that “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And studies of rudeness toward medical students found that rude treatment by attending physicians, residents, and nurses was also a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area in the brain that responds to unpleasant emotional experiences—show that the brain becomes desensitized with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.   When these leaders set out on a course of telling lies, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and points them out to the rest of us whenever they can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]

 

On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.

 

Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This example seems ripe for joke-telling, and sure enough, someone made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate a joke-teller as highly confident.  But inappropriate or insensitive jokes don’t do a joke-teller any favors because they can have a negative impact.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and start searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

Feeling Lazy? Blame Evolution

I’m kind of lazy.  I admit it. I like to walk, ride a bike, and splash around in a pool, but I don’t indulge in a lot of exercise beyond that.

Now a Harvard professor named Daniel Lieberman says I can blame human evolution.  In a recent paper, “Is Exercise Really Medicine? An Evolutionary Perspective,” he explains his ideas.

First, he says (and this is the sentence I really like), “It is natural and normal to be physically lazy.”  Why?  Because human evolution has led us to exercise only as much as we must to survive.

We all know that our ancestors lived as hunter-gatherers and that food was often scarce.  Lieberman adds this idea:  Resting was key to conserving energy for survival and reproduction.  “In other words, humans were born to run—but as little as possible.”

As he points out, “No hunter-gatherer goes out for a jog, just for the sake of it….”  Thus, we evolved “to require stimuli from physical activity.”  For example, muscles become bigger and more powerful with use, and they atrophy when they’re not used.  In the human circulatory system, “vigorous activity stimulates expansion of …circulation,” improves the heart’s ability to pump blood, and increases the elasticity of arteries.  But with less exercise, arteries stiffen, the heart pumps less blood, and metabolism slows.

Lieberman emphasizes that this entire process evolved to conserve energy whenever possible.  Muscles use a lot of calories, making them costly to maintain.  Muscle wasting thus evolved as a way to lower energy consumption when physical activity wasn’t required.

What about now?  Until recently, it was never possible in human history to lead an existence devoid of activity.  The result:  According to Lieberman, the mechanisms humans have always used to reduce energy expenditures in the absence of physical activity now manifest as diseases.

So maladies like heart disease, diabetes, and osteoporosis are now the consequences of adaptations that evolved to trim energy demand, and modern medicine is now stuck with treating the symptoms.

In the past, hunter-gatherers had to exercise because if they didn’t, they had nothing to eat.  Securing food was an enormous incentive.  But today, for most humans there are very few incentives to exercise.

How do we change that?  Although there’s “no silver bullet,” Lieberman thinks we can try to make activity “more fun for more people.”  Maybe making exercise more “social” would help.  Community sports like soccer teams and fun-runs might encourage more people to get active.

Lieberman has another suggestion.  At his own university, students are no longer required to take physical education as part of the curriculum.  Harvard voted its physical-education requirement out of existence in the 1970s, and he thinks it’s time to reinstate it.  He notes surveys that show that very few students who are not athletes on a team get sufficient exercise.  A quarter of Harvard undergraduates have reported being sedentary.

Because “study after study shows that…people who get more physical activity have better concentration, their memories are better, they focus better,” Lieberman argues that the time spent exercising is “returned in spades…not only in the short term, but also in the long term.  Shouldn’t we care about the long-term mental and physical health of our students?”

Lieberman makes a powerful argument for reinstating phys-ed in those colleges and universities that have dropped it.  His argument also makes sense for those of us no longer in school.

Let’s foil what the millennia of evolution have done to our bodies and boost our own level of exercise as much as we can.

Tennis, anyone?

 

[Daniel Lieberman’s paper was the focus of an article in the September-October 2016 issue of Harvard Magazine.  He’s the Lerner professor of biological sciences at Harvard.]

 

It’s Gonna Be a Bright, Bright, Bright, Sunshiny Day

Summer has arrived, and with it, lots and lots of sunshine. Americans love sunshine and flock to it whenever we can.

But sunshine isn’t an unmitigated benefit. Dangers lurk in those rays. Most of us know by now that harmful UV (ultraviolet) light can be toxic, and we use sunscreen (more or less religiously) to deter the most harmful effects. The CDC (the U.S. Centers for Disease Control and Prevention) recommends avoiding prolonged exposure to the sun and wearing sunscreen with a minimum SPF of 15.

So what else is new? Well, a raft of recent studies has in fact produced some brand-new information about sunshine.

First, researchers at Harvard have turned up startling evidence that may explain why some people don’t restrict their time in the sun even though they’re aware of the dangers. It seems that basking in those UV rays can be addictive. According to a new study from Harvard Medical School, investigators at Mass General Hospital have found that chronic UV exposure raises “circulating levels of beta-endorphin” in mice, and that mice who become accustomed to those levels exhibit “withdrawal symptoms” when the beta-endorphin activity is blocked.

What is beta-endorphin? Most of us have heard of endorphins, powerful neurotransmitters that originate in the human body, controlling emotions and often blocking pain. A “beta-endorphin” is one kind of endorphin, considered not only a stronger pain-blocker than morphine but also a producer of feelings of pleasure. Beta-endorphin is much like an opioid, a/k/a an opiate, a drug used by the medical profession to treat pain. Opioids/opiates impact the brain, leading to feelings of intense pleasure, and addiction—both physical and mental—can develop very quickly.

According to lead researcher David E. Fisher, the Wigglesworth Professor and Chair of Dermatology at HMS and Mass General, the Harvard study found that UV radiation produced “opiate-like effects, including addictive behavior.” The study, reported in the June 19 issue of Cell, may explain what Fisher calls “the ‘sun-seeking’ behavior that may underlie the relentless rise in most forms of skin cancer.”

The study adds to mounting evidence of addictive behavior among humans as well as mice. Several studies have noted addiction-like behavior in people using indoor tanning facilities. Other studies found that an “opioid blocker” produced withdrawal-like symptoms in these frequent tanners.

Let’s look at what the Harvard researchers did. For six weeks, they delivered a daily dose of UV light—equal to the exposure of fair-skinned humans to 20 or 30 minutes of midday Florida sun—to the shaved backs of a group of mice. Within a week, levels of beta-endorphin in their blood rose significantly. The levels remained elevated during the study period, gradually returning to normal only after UV exposure ended. When the researchers administered naloxone, a drug known to block opioid-pathway activity, the mice had such classic symptoms of opioid withdrawal as trembling, shaking, and teeth-chattering.

Fisher concluded that humans might react the same way, noting that “a natural mechanism reinforcing UV-seeking behavior may have developed during mammalian evolution.” Why? Probably because we synthesize vitamin D (increasingly viewed as an essential nutrient) from the UV light in sunshine. Low blood levels of vitamin D can lead to a host of ailments. But sun-seeking behavior, Fisher warned, carries with it “the carcinogenic risk of UV light.” Other sources of vitamin D, like cheap oral supplements, more safely and accurately maintain healthy vitamin D levels.

Fisher concluded that because persistent UV-seeking appears to be addictive, reducing skin-cancer risks may require “actively confronting” hazardous behavior like indoor tanning. Although both the CDC and the FDA have indicted indoor tanning as highly dangerous, Fisher fears the “passive risk-messages” we’re using aren’t good enough. In other words, we need to use much scarier warnings to let addicted sun-lovers know how destructive their behavior is.

At the same time, there’s new information that not enough sun exposure can also be risky. The July issue of the Journal of Internal Medicine reported on a study conducted in Sweden suggesting that the CDC guidelines may be too restrictive in regions with limited sunshine. The Swedes tracked sun exposure in 30,000 light-skinned Swedish women ages 25 to 64 from 1990 to 1992, gathering a wealth of information, including time spent sunbathing and the use of tanning beds. When national statistics were later reviewed in 2011, it appeared that women who got the most sun had the greatest risk of developing skin cancer. No big surprise there. But it was surprising that the women who avoided the sun were twice as likely to die from any cause, including skin cancer, while those with moderate sun exposure had a 40 percent higher risk. The study didn’t include information on blood levels of vitamin D or the use of vitamin D supplements, but its results may actually support the Harvard conclusions. If exposure to sunshine leads to better health overall, maybe that’s because sunshine leads to higher vitamin D levels. If so, our focus should probably shift to ensuring that everyone gets enough vitamin D, regardless of sun exposure.

Finally, if you’re a confirmed sun-lover, you might want to know about another study, this one conducted at Northwestern Medical School. Researchers found that people who are exposed to even moderately bright morning light have a significantly lower body mass index (BMI, based on height and weight) than those who had their first exposure later in the day. According to lead author Kathryn Reid, the earlier the light exposure, the lower one’s BMI. Senior author Phyllis Zee, director of the school’s Sleep and Circadian Rhythms Research Program, noted that light is the “most potent agent to synchronize your internal body clock,” regulating circadian rhythms, which in turn “regulate energy balance.” In short, if you want to lower your BMI, you might want to get outside between 8 a.m. and noon (20 to 30 minutes should do it).
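
(A quick aside on the arithmetic, for anyone unfamiliar with the measure: BMI is simply weight divided by the square of height, in metric units. The numbers below are only an illustrative example of mine, not figures from the Northwestern study.)

$$\text{BMI} \;=\; \frac{\text{weight in kilograms}}{(\text{height in meters})^{2}}, \qquad \text{for example } \frac{70}{(1.75)^{2}} \approx 22.9$$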

Summing up, “I can see clearly now….”

For people like me (a redhead with extremely light skin), the less sun exposure the better. Although I love being outside on a sunny day, I use lots of sunscreen and stay in the shade as much as I can. I make up the vitamin D deficit by taking an inexpensive supplement every morning.

For everyone else, the studies pretty much reinforce what we already knew. They suggest keeping your sun exposure within bounds. Some time in the sun is fine, so long as you use some sunscreen. Avoid overdoing your exposure via indoor tanning, and otherwise avoid addictive sun-seeking behavior. But one way or another, don’t forget to get some vitamin D.

Hey There, Handsome!

Hey, handsome!  You know who you are.  You’re a charitable donor to at least one worthy cause you support.

Say what? 

In a recent article in The Wall Street Journal, Arthur C. Brooks, the head of a nonprofit organization, accumulated a wealth of data to support the conclusion that giving to charity makes us happier, healthier, and yes, even better-looking.

First, according to one study cited by Brooks, happiness and giving are strongly correlated.  A survey by the University of Chicago showed that charitable givers are 43% more likely to say they are “very happy” than non-givers.  By contrast, non-givers are 3.5 times more likely to say they are “not happy at all.”  Wow!

But is it really charitable giving that makes us happier, or is it the reverse?  Another study provided one answer.  Researchers from Harvard and the University of British Columbia found that the amount of money subjects spent on themselves was “inconsequential for happiness,” but spending on others resulted in significant gains in happiness. 

In another study, University of Oregon researchers asked people to divide $100 between a food pantry and their own wallets.  The researchers used a brain scanner to see what happened.  It turned out that choosing the charitable option lit up the brain’s center of pleasure and reward, the same center that lights up in response to pleasurable music, addictive drugs, and a mother’s bond with her children.

Are we also healthier when we act in a charitable way?  Brooks cited several studies that say we are.  A University of Buffalo psychologist recently studied more than 800 residents of Detroit and found that volunteering for a charity significantly lowered the association between stressful life-events and death. 

Two studies conducted in California lent further support to this notion.  When researchers at Stanford and the Buck Institute for Research on Aging tracked nearly 2000 older Americans over a nine-year period, they found that the dedicated volunteers in the group were 56% more likely to have survived all nine years than non-volunteers who started out in identical health.  A study of teenagers yielded even more support.  In 2008, the University of California reported that altruistic teenagers were physically and mentally healthier later in their lives than their less generous peers.

And now we get to our most intriguing question:  Does being charitable do anything for the way you look?  Dutch and British researchers recently showed women college students one of three videos featuring the same good-looking actor.  In the first video, he gave generously to a man begging on the street.  In the second, he appeared to give only a little money.  In the third, the actor gave nothing to the panhandler.  The result? The more he gave, the more handsome he appeared to the women in the study.

Brooks concluded that this finding explains why men loosen their wallets in an attempt to impress women.  And he uncovered one more study to support his conclusion.  A 1999 experiment conducted by the University of Liverpool showed that “eager men” on first dates gave significantly more to a panhandler than men who were already in comfortable long-term relationships.

In short, giving generously to the causes we support really does appear to boost our well-being and our esteem—even our appearance—in the eyes of others.  Although I have reservations about some of the techniques used by charities to pry money from us (see “Why Am I Suddenly a Member?” found elsewhere on this blog), I wholeheartedly support charitable giving and volunteering on behalf of worthy causes.

The charitable men in my life have always looked good to me, and as I’ve gotten older, I find they’re looking better and better.

As for me, in addition to my feeling good about giving, I now know that it helps me look good, too.

That reminds me…where’s my checkbook?