Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the threat posed by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People were aware of the disease called polio—the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young)—for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of widespread fear:  it, too, was a new disease with an unknown cause.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  Until there is one, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, and it soon became a charity called the March of Dimes.  The catch phrase “march of dimes” was coined by popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Using a name like that of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease. (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine. Then, in 1954, a massive field trial tested the vaccine on over one million children between six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like John Kerry’s demand when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, administered orally, which was both easier to give and less expensive, and I soon received that as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for that to happen.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine to defeat another form of the coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to kill deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Let’s keep going as long as we can

One thing everyone can agree on:  Every single day, we’re all getting older.

But we don’t have to let that indisputable fact stop us from doing what we want to do.

I just came across a spectacular example of a 96-year-old scientist who keeps on going and going and going….

By sheer coincidence, he’s a man who’s worked for decades in the field of battery speed and capacity.  And he’s very much more than good enough to serve as an astounding example of enduring optimism and hard work.

A Wall Street Journal story in August profiled John Goodenough, who helped invent the rechargeable lithium-ion battery that powers cell phones and a host of other electronic products.  By introducing lithium cobalt oxide into the inner workings of batteries in 1980, he made batteries not only more powerful but also more portable.

At age 96, he now wants to kill off his own creation by removing the cobalt that allowed his battery to charge faster and last longer.  In April 2018, he and three co-authors published research that may lead to a new battery that’s liquid-free and cobalt-free.

Initial research shows that the new battery could potentially double the energy density of the lithium-ion battery.  That would mean that an electric car, for example, could drive twice as far on one charge.
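To make that claim concrete, here’s a minimal back-of-the-envelope sketch (the pack size and consumption figures below are hypothetical, not taken from the research): a car’s range is roughly the energy stored in its battery divided by the energy it uses per mile, so a same-sized pack with double the energy density roughly doubles the range.

def driving_range_miles(pack_energy_kwh: float, consumption_kwh_per_mile: float) -> float:
    # Range is simply stored energy divided by energy used per mile.
    return pack_energy_kwh / consumption_kwh_per_mile

current_pack_kwh = 60.0                  # hypothetical lithium-ion pack
doubled_pack_kwh = 2 * current_pack_kwh  # same-size pack at double the energy density
consumption = 0.30                       # hypothetical kWh used per mile

print(driving_range_miles(current_pack_kwh, consumption))  # 200.0 miles
print(driving_range_miles(doubled_pack_kwh, consumption))  # 400.0 miles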

“My mission is to try to see if I can transform the battery world before I die,” Dr. Goodenough says.  He added that he has no plans to retire.  “When I’m no longer able to drive and I’m forced to go into a nursing home, then I suppose I will be retiring.”

Goodenough works in an untidy office at the University of Texas at Austin, where he’s a professor of engineering.  He begins work between 8 and 8:30 a.m., leaves around 6 p.m., and works from home throughout the weekend.

He hand-writes his research and doesn’t own a cell phone, rejecting the mobile technology that his batteries made possible.  His car is a 10-year-old Honda that he hopes will last as long as he does.

His motivation is to help electric cars wean society off its dependence on the combustion engine, like the one in his Honda.

“He is driven by scientific curiosity, and he really wants to do something for society with the science he does,” says one of his colleagues, another engineering professor at UT, Arumugam Manthiram.

Isn’t it heartening to come across someone like John Goodenough, a remarkable human being who refuses to quit?

His story energizes me.  Although I’m considerably younger than Goodenough, it encourages me to pursue my passions no matter how old I get.

Does his story energize you, too?

 

[This blog post is somewhat shorter than usual because I’m currently in the midst of publishing my third novel, RED DIANA.  I’m hoping it will be available soon at bookstores everywhere and on Amazon.com.]

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

During one or two summers, though, I didn’t totally escape a couple of bad sunburns.  Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burns.  But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid using sunscreens harboring suspect chemicals, I use only those sunscreens whose active ingredients include—guess what—zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both are inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

A new book you may want to know about

There’s one thing we can all agree on:  Trying to stay healthy.

That’s why you may want to know about a new book, Killer diseases, modern-day epidemics:  Keys to stopping heart disease, diabetes, cancer, and obesity in their tracks, by Swarna Moldanado, PhD, MPH, and Alex Moldanado, MD.

In this extraordinary book, the authors have pulled together an invaluable compendium of both evidence and advice on how to stop the “killer diseases” they call “modern-day epidemics.”

First, using their accumulated wisdom and experience in public health, nursing science, and family medical practice, Swarna and Alex Moldanado offer the reader a wide array of scientific evidence.  Next, they present their well-thought-out conclusions on how this evidence supports their theories of how to combat the killer diseases that plague us today.

Their most compelling conclusion:  Lifestyle choices have an overwhelming impact on our health.  So although some individuals may suffer from diseases that are unavoidable, evidence points to the tremendous importance of lifestyle choices.

Specifically, the authors note that evidence “points to the fact that some of the most lethal cancers are attributable to lifestyle choices.”  Choosing to smoke tobacco or to consume alcohol in excess is an example of the sort of risky lifestyle choice that can lead to this killer disease.

Similarly, cardiovascular diseases–diseases of the heart and blood vessels–share many common risk factors.  Clear evidence demonstrates that eating an unhealthy diet, a diet that includes too many saturated fats—fatty meats, baked goods, and certain dairy products—is a critical factor in the development of cardiovascular disease. The increasing size of food portions in our diet is another risk factor many people may not be aware of.

On the other hand, most of us are aware of the dangers of physical inactivity.  But knowledge of these dangers is not enough.  Many of us must change our lifestyle choices.  Those of us in sedentary careers, for example, must become far more physically active than our daily routines naturally allow.

Yes, the basics of this information appear frequently in the media.  But the Moldanados reveal a great deal of scientific evidence you might not know about.

Even more importantly, in Chapter 8, “Making and Keeping the Right Lifestyle Choices,” the authors step up to the plate in a big way.  Here they clearly and forcefully state their specific recommendations for succeeding in the fight against killer diseases.

Following these recommendations could lead all of us to a healthier and brighter outcome.

Kudos to the authors for collecting an enormous volume of evidence, clearly presenting it to us, and concluding with their invaluable recommendations.

No more excuses!  Let’s resolve to follow their advice and move in the right direction to help ensure our good health.

Pockets!

Women’s clothes should all have pockets. 

(A bit later in this post, I’ll explain why.)

I admit it.  I’m a pocket-freak.

When I shop for new pants, I don’t bother buying them, no matter how appealing they are, if they don’t have pockets.  Why?

Because whenever I bought pants without pockets in the past, I discovered over time that I never wore them. They languished forever in a shameful pile of unworn clothes.

It became clear that I liked the benefits of wearing pants with pockets.  Why then would I buy new pants without pockets when those I already had were languishing unworn?

Result:  I simply don’t buy no-pocket pants anymore.

Most jeans have pockets, often multiple pockets, and I like wearing them for that reason, among others.  (Please see “They’re My Blue Jeans, and I’ll Wear Them If I Want To,” published in this blog in May 2017.)

Most jackets, but not all, have pockets.  Why not?  They all need pockets.  How useful is a jacket if it doesn’t have even one pocket to stash your stuff?

Dresses and skirts should also have pockets.  Maybe an occasional event, like a fancy gala, seems to require a form-fitting dress that doesn’t have pockets.  But how many women actually go to galas like that?  Looking back over my lifetime of clothes-wearing, I can think of very few occasions when I had to wear a no-pocket dress.  As for skirts, I lump them in the same category as pants.  Unless you feel compelled for some bizarre reason to wear a skin-tight pencil skirt, what good is a skirt without pockets?

Cardigan sweaters, like jackets, should also have pockets.  So should robes.  Pajamas. Even nightgowns.  I wear nightgowns, and I relish being able to stick something like a facial tissue into the pocket of my nightgown!   You never know when you’re going to sneeze, right?

Did you ever watch a TV program called “Project Runway?”  It features largely unknown fashion designers competing for approval from judges, primarily high-profile insiders in the fashion industry.  Here’s what I’ve noticed when I’ve watched an occasional episode:  Whenever a competing designer puts pockets in her or his designs, the judges enthusiastically applaud that design.  They clearly recognize the value of pockets and the desire by women to wear clothes that include them.

(By the way, fake pockets are an abomination.  Why do designers think it’s a good idea to put a fake pocket on their designs?  Sewing what looks like a pocket but isn’t a real pocket adds insult to injury.  Either put a real pocket there, or forget the whole thing.  Fake pockets?  Boo!)

Despite the longing for pockets by women like me, it can be challenging to find women’s clothes with pockets.  Why?

Several women writers have speculated about this challenge, generally railing against sexist attitudes that have led to no-pocket clothing for women.

Those who’ve traced the evolution of pockets throughout history discovered that neither men nor women wore clothing with pockets until the 17th century.  Pockets in menswear began appearing in the late 1600s.  But women?  To carry anything, they were forced to wrap a sack with a string worn around their waists and tuck the sack under their petticoats.

These sacks eventually evolved into small purses called reticules that women would carry in their hands.  But reticules were so small that they limited what women could carry.  As the twentieth century loomed, women rebelled.  According to London’s Victoria and Albert Museum, dress patterns started to include instructions for sewing pockets into skirts.  And when women began wearing pants, they would finally have pockets.

But things soon switched back to no-pocket pants.  The fashion industry wasn’t a big fan of pockets, insisting on featuring “slimming” designs for women, while men’s clothes still had scads of pockets.  The result has been the rise of bigger and bigger handbags (interestingly, handbags are often called “pocketbooks” on the East Coast).

Enormous handbags create a tremendous burden for women.  Their size and weight can literally weigh a woman down, impeding her ability to move through her busy life the way men can.  (I’ve eschewed bulky handbags, often wearing a backpack instead.  Unfortunately, backpacks are not always appropriate in a particular setting.)

Today, many women are demanding pockets.  Some have advocated pockets with the specific goal of enabling women to carry their iPhones or other cell phones that way.  I’m a pocket-freak, but some recent scientific research suggests that the radiation cell phones emit may pose a risk to your health.  Some experts in the field have therefore advised against keeping a cell phone adjacent to your body.  In December 2017, the California Department of Public Health specifically warned against keeping a cell phone in your pocket.  So, in my view, advocating pockets for that reason is not a good idea.

We need pockets in our clothes for a much more important and fundamental reason:  Freedom.

Pockets give women the kind of freedom men have:  The freedom to carry possessions close to their bodies, allowing them to reach for essentials like keys without fumbling through a clumsy handbag.

I propose a boycott on no-pocket clothes.  If enough women boycott no-pocket pants, for example, designers and manufacturers will have to pay attention.  Their new clothing lines will undoubtedly include more pockets.

I hereby pledge not to purchase any clothes without pockets.

Will you join me?

Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good brown fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight-gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

The Last Straw(s)

A crusade against plastic drinking straws?  Huh?

At first glance, it may strike you as frivolous.  But it’s not.  In fact, it’s pretty darned serious.

In California, the city of Berkeley may kick off such a crusade.   In June, the city council directed its staff to research what would be California’s first city ordinance prohibiting the use of plastic drinking straws in bars, restaurants, and coffee shops.

Berkeley is responding to efforts by nonprofit groups like the Surfrider Foundation that want to eliminate a significant source of pollution in our oceans, lakes, and other bodies of water. According to the conservation group Save the Bay, the annual cleanup days held on California beaches have found that plastic straws and stirrers are the sixth most common kind of litter.  If they’re on our beaches, they’re flowing into the San Francisco Bay, into the Pacific Ocean, and ultimately into oceans all over the world.

As City Councilwoman Sophie Hahn, a co-author of the proposal to study the ban, has noted, “They are not biodegradable, and there are alternatives.”

I’ve been told that plastic straws aren’t recyclable, either.  So whenever I find myself using a plastic straw to slurp my drink, I conscientiously separate my waste:  my can of Coke Zero goes into the recycling bin; my plastic straw goes into the landfill bin.  This is nuts.  Banning plastic straws in favor of paper ones is the answer.

Realistically, it may be a tough fight to ban plastic straws because business interests (like the Monster Straw Co. in Laguna Beach) want to keep making and selling them.  And business owners claim that plastic straws are more cost-effective and that customers prefer them.  As Monster’s founder and owner, Natalie Buketov, told the SF Chronicle, “right now the public wants cheap plastic straws.”

Berkeley could vote on a ban by early 2018.

On the restaurant front, some chefs would like to see the end of plastic straws.  Spearheading a growing movement to steer eateries away from serving straws is Marcel Vigneron, owner-chef of Wolf Restaurant on Melrose Avenue in L.A.  Vigneron, who’s appeared on TV’s “Top Chef” and “Iron Chef,” is also an enthusiastic surfer, and he’s seen the impact of straw-pollution on the beaches and marine wildlife.  He likes the moniker “Straws Suck” to promote his effort to move away from straws, especially the play on words:  “You actually use straws to suck, and they suck because they pollute the oceans,” he told CBS in July.

Vigneron added that if a customer wants a straw, his restaurant has them.  But servers ask customers whether they want a straw instead of automatically putting them into customers’ drinks.  He notes that every day, 500 million straws are used in the U.S., and they could “fill up 127 school buses.”  He wants to change all that.

Drinking straws have a long history.  Their origins were apparently actual straw, or other straw-like grasses and plants.  The first paper straw, made from paper coated with paraffin wax, was patented in 1888 by Marvin Stone, who didn’t like the flavor of a rye grass straw added to his mint julep.  The “bendy” paper straw was patented in 1937.  But the plastic straw took off, along with many other plastic innovations, in the 1960s, and nowadays they’re difficult to avoid.

Campaigns like Surfrider’s have taken off because of mounting concern with plastic pollution.  Surfrider, which has also campaigned against other threats to our oceans, like plastic bags and cigarette butts, supports the “Straws Suck” effort, and according to author David Suzuki, Bacardi has joined with Surfrider in the movement to ban plastic straws.

Our neighbors to the north have already leaped ahead of California.  The town of Tofino in British Columbia claims that it mounted the very first “Straws Suck” campaign in 2016.  By Earth Day in April that year, almost every local business had banned plastic straws.  A fascinating story describing this effort appeared in the Vancouver Sun on April 22, 2016.

All of us in the U.S., indeed the world, need to pay attention to what plastic is doing to our environment.  “At the current rate, we are really headed toward a plastic planet,” according to the author of a study published in the journal Science Advances and reported by AP in July.  Roland Geyer, an industrial ecologist at UC Santa Barbara, noted that there’s enough discarded plastic to bury Manhattan under more than 2 miles of trash.

Geyer used the plastics industry’s own data to find that the amount of plastics made and thrown out is accelerating.  In 2015, the world created more than twice as much as it made in 1998.

The plastics industry has fought back, relying on the standard of cost-effectiveness.  It claims that alternatives to plastic, like glass, paper, or aluminum, would require more energy to produce.  But even if that’s true, the energy difference in the case of items like drinking straws would probably be minimal.  If we substitute paper straws for plastic ones, the cost difference would likely be negligible, while the difference for our environment—eliminating all those plastic straws floating around in our waterways–could be significant.

Aside from city bans and eco-conscious restaurateurs, we need to challenge entities like Starbucks.  The mega-coffee-company and coffeehouse-chain prominently offers, even flaunts, brightly-colored plastic straws for customers sipping its cold drinks.  What’s worse:  they happily sell them to others!  Just check out the Starbucks straws for sale on Amazon.com.  Knowing what we know about plastic pollution, I think Starbucks’s choice to further pollute our environment by selling its plastic straws on the Internet is unforgivable.

At the end of the day, isn’t this really the last straw?

 

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting far more worrisome results from rude behavior:  serious errors made by doctors and nurses as a result of rudeness.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies, reported on April 12 in The Wall Street Journal, concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams heard a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department. The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for The Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient-emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  In a 2015 British study, “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And rudeness toward medical students by attending physicians, residents, and nurses also appeared to be a frequent problem in other studies.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area in the brain that responds to unpleasant emotional experiences—show that the brain becomes desensitized with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.   When these leaders set out on a course of telling lies, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and points them out to the rest of us whenever they can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]

 

On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.

 

Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This example seems ripe for joke-telling, and sure enough, someone made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate a joke-teller as highly confident.  But inappropriate or insensitive jokes don’t do a joke-teller any favors because they can have a negative impact.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and start searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs. But in 1997, the Food and Drug Administration revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries–New Zealand is the other one–that permit this kind of advertising.

The Food and Drug Administration is responsible for regulating it and is supposed to take into account ethical and other concerns to prevent the undue influence of DTC advertising on consumer demand.  The fear was that advertising would lead to a demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at reining in skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  Calling for a three-year moratorium on advertising new prescription drugs directly to consumers, the bill would freeze these ads, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  That these ads are pretty much about making more and more money for the drug manufacturers.  And that most of the people who can afford them are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  And those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]