Tag Archives: environment

Hand-washing and drying–the right way–can save your life

The flu has hit the U.S., and hit it hard.  We’ve already seen flu-related deaths.  And now we confront a serious new threat, the coronavirus.

There’s no guarantee that this year’s flu vaccine is as effective as we would like, and right now we have no vaccine or other medical means to avoid the coronavirus.  So we need to employ other ways to contain the spread of the flu and other dangerous infections.

One simple way to foil all of these infections is to wash our hands often, and to do it right.  The Centers for Disease Control and Prevention have cautioned that to avoid the flu, we should “stay away from sick people,” adding it’s “also important to wash hands often with soap and water.”

On February 9 of this year, The New York Times repeated this message, noting that “[h]ealth professionals say washing hands with soap and water is the most effective line of defense against colds, flu and other illnesses.”  In the fight against the coronavirus, the CDC has once again reminded us of the importance of hand-washing, stating that it “can reduce the risk of respiratory infections by 16 percent.”

BUT one aspect of hand-washing is frequently overlooked:  Once we’ve washed our hands, how do we dry them?

The goal of hand-washing is to stop the spread of bacteria and viruses.  But when we wash our hands in public places, we don’t always encounter the best way to dry them. 

Restaurants, stores, theaters, museums, and other institutions offering restrooms for their patrons generally confront us with only one way to dry our hands:  paper towels OR air blowers.  A few establishments offer both, giving us a choice, but most do not.

I’m a strong proponent of paper towels, and my position has garnered support from an epidemiologist at the Mayo Clinic, Rodney Lee Thompson.

According to a story in The Wall Street Journal a few years ago, the Mayo Clinic published a comprehensive review of every known hand-washing study done since 1970.  The conclusion?  Drying one’s skin is essential to staving off bacteria, and paper towels are better at that than air blowers.

Why?  Paper towels are more efficient, they don’t splatter germs, they won’t dry out your skin, and most people prefer them (and therefore are more likely to wash their hands in the first place).

Thompson’s own study was included in the overall study, and he concurred with its conclusions.  He observed people washing their hands at places like sports stadiums.  “The trouble with blowers,” he said, is that “they take so long.”  Most people dry their hands for a short time, then “wipe them on their dirty jeans, or open the door with their still-wet hands.”

Besides being time-consuming, most blowers are extremely noisy.  Their decibel level can be deafening.  Like Thompson, I think these noisy and inefficient blowers “turn people off.”

But there’s “no downside to the paper towel,” either psychologically or environmentally.  Thompson stated that electric blowers consume more energy than it takes to produce a paper towel, so they don’t appear to benefit the environment either.

The air-blower industry argues that blowers reduce bacterial transmission, but studies show that the opposite is true.  These studies found that blowers tend to spread bacteria 3 to 6 feet from the dryer.  To keep bacteria from spreading, Thompson urged using a paper towel to dry your hands, opening the restroom door with it, then throwing it into the trash.

An episode of the TV series “Mythbusters” provided additional evidence to support Thompson’s conclusions.  The results of tests conducted on this program, aired in 2013, demonstrated that paper towels are more effective at removing bacteria from one’s hands and that air blowers spread more bacteria around the blower area.

In San Francisco, where I live, many restrooms have posted signs stating that they’re composting paper towels to reduce waste.  So, because San Francisco has an ambitious composting scheme, we’re not adding paper towels to our landfills or recycling bins.  Other cities may already be doing the same, and still others will undoubtedly follow.

Because I strongly advocate replacing air blowers with paper towels in public restrooms, I think our political leaders should pay attention to this issue.  If they conclude, as overwhelming evidence suggests, that paper towels are better both for our health and for the environment, they can enact local ordinances requiring that public restrooms use paper towels instead of air blowers.  State legislation would lead to an even better outcome.

A transition period would allow the temporary use of blowers until paper towels could be installed.

If you agree with this position, we can take action ourselves by asking those who manage the restrooms we frequent to adopt the use of paper towels, if they haven’t done so already.

Paper towels or air blowers?  The answer, my friend, is blowin’ in the wind.  The answer is blowin’ in the wind.

 

Coal: A Personal History

It’s January, and much of the country is confronting freezing temperatures, snow, and ice.  I live in San Francisco now, but I vividly remember what life is like in cold-weather climates.

When I was growing up on the North Side of Chicago, my winter garb followed this pattern:

Skirt and blouse, socks (usually short enough to leave my legs largely bare), a woolen coat, and a silk scarf for my head.  Under my coat, I might have added a cardigan sweater.  But during the freezing cold days of winter (nearly every day during a normal Chicago winter), I was always COLD—when I was outside, that is.

My parents were caring and loving, but they followed the norms of most middle-class parents in Chicago during that era.  No one questioned this attire.  I recall shivering whenever our family ventured outside for a special event during the winter.  I especially remember the excitement of going downtown to see the first showing of Disney’s “Cinderella.”  Daddy parked our Chevy at an outdoor parking lot blocks from the theater on State Street, and we bravely faced the winter winds as we made our way there on foot.  I remember being COLD.

School days were somewhat different.  On bitter cold days, we girls were allowed to cover our legs, but only if we hung our Levi’s in our lockers when we arrived at school.  We may have added mufflers around our heads and necks to create just a little more warmth as we walked blocks and blocks to school in the morning, back home for lunch, then back to school for the afternoon.

Looking back, I can’t help wondering why it never occurred to our parents to clothe us more warmly.  Weren’t they aware of the warmer winter clothing worn elsewhere?  One reason that we didn’t adopt warmer winter garb–like thermal underwear, or down jackets, or ski parkas–may have been a lack of awareness that such things existed.  Or the answer may have been even simpler:  the abundance of coal.

Inside, we were never cold.  Why?  Because heating with coal was ubiquitous.  It heated our apartment buildings, our houses, our schools, our stores, our movie theaters, our libraries, our public buildings, and almost everywhere else.  Radiators heated by coal hissed all winter long.  The result?  Overheated air.

Despite the bleak winter outside, inside I was never cold.  On the contrary, I was probably much too warm in the overheated spaces we inhabited.

Until I was 12, we lived in an apartment with lots of windows.  In winter the radiators were always blazing hot, so hot that we never felt the cold air outside.  The window glass would be covered in condensed moisture, a product of the intensely heated air, and I remember drawing funny faces on the glass that annoyed my scrupulous-housekeeper mother.

Where did all that heat come from?  I never questioned its ultimate source.

I later learned that the coal was extracted from deep beneath the earth.  But what happened to it above ground was no secret.  More than once, I watched trucks pull up outside my apartment building to deliver large quantities of coal.  The driver would set up a chute that sent the coal directly into the basement, where all those lumps of coal must have been shoveled into a big furnace.

Coal was the primary source of heat back then, and the environment suffered as a result.  After the coal was burned in the furnace, its ashes would be shoveled into bags.  Many of the ashes found their way into the environment.  They were, for example, used on pavements and streets to cope with snow and ice.

The residue from burning coal also led to other harmful results.  Every chimney spewed thick sooty smoke all winter, sending into the air the toxic particles that we all inhaled.

Coal was plentiful, cheap, and reliable.  And few people were able to choose alternatives like fireplaces and wood-burning furnaces (which presented their own problems).

Eventually, cleaner and more easily distributed forms of heating fuel displaced coal.  Residential use dropped, and according to one source, today it amounts to less than one percent of heating fuel.

But coal still plays a big part in our lives.  As Malcolm Turnbull, the former prime minister of Australia (which is currently suffering the consequences of climate change), wrote earlier this month in TIME magazine, the issue of “climate action” has been “hijacked by a toxic, climate-denying alliance of right-wing politics and media…, as well as vested business interests, especially in the coal industry.”  He added:  “Above all, we have to urgently stop burning coal and other fossil fuels.”

In her book Inconspicuous Consumption: The Environmental Impact You Don’t Know You Have, Tatiana Schlossberg points out that we still get about one-third of our electricity from coal.  So “streaming your online video may be coal-powered.”  Using as her source a 2014 EPA publication, she notes that coal ash remains one of the largest industrial solid-waste streams in the country, largely under-regulated, ending up polluting groundwater, streams, lakes, and rivers across the country.

“As crazy as this might sound,” Schlossberg writes, watching your favorite episode of “The Office” might come at the expense of clean water for someone else.  She’s concerned that even though we know we need electricity to power our computers, we don’t realize that going online itself uses electricity, which often comes from fossil fuels.

Illinois is finally dealing with at least one result of its longtime dependence on coal.  Environmental groups like Earthjustice celebrated a big win in Illinois in 2019 when they helped secure passage of milestone legislation strengthening rules for cleaning up the state’s coal-ash dumps.  In a special report, Earthjustice noted that coal ash, the toxic residue of burning coal, has been dumped nationwide into more than 1,000 unlined ponds and landfills, where it leaches into waterways and drinking water.

Illinois in particular has been severely impacted by coal ash.  It is belatedly overhauling its legacy of toxic coal waste and the resulting widespread pollution in groundwater near its 24 coal-ash dumpsites.  The new legislation funds coal-ash cleanup programs and requires polluters to set aside funds to ensure that they, not taxpayers, pay for closure and cleanup of coal-ash dumps.

Earthjustice rightfully trumpets its victory, which will now protect Illinois residents and its waters from future toxic pollution by coal ash.  But what about the legacy of the past, and what about the legacy of toxic coal particles that entered the air decades ago?

As an adult, I wonder about the huge quantities of coal dust I must have inhaled during every six-month-long Chicago winter that I lived through as a child.  I appear to have so far escaped adverse health consequences, but that could change at any time.

And I wonder about others in my generation.  How many of us have suffered or will suffer serious health problems as a result of drinking polluted water and inhaling toxic coal-dust particles?

I suspect that many in my generation have been unwilling victims of our decades-long dependence on coal.

 

 

How about Thanks AND Giving?

I was scanning the aisles at Trader Joe’s when I noticed one of its 99-cent greeting cards.

The message caught my eye:  “Let our lives be full of BOTH Thanks & Giving.”

It struck me as the perfect card for the Thanksgiving holiday.  I grabbed it and carried it off with the rest of my purchases, planning to bestow it on a loved one at our annual feast.

But while the card patiently awaits its presentation on the holiday, the message has stayed with me.  What better sentiment to express at this time of year?

Just when Thanksgiving is on our minds, we’re inundated by pleas for money from a variety of causes.  At the same time, we want to give holiday presents to loved ones, friends, and acquaintances.

My proposal:  Let’s focus on both giving thanks and just plain giving.

So, as we celebrate the Thanksgiving holiday this week, I’m keeping both in mind.

First, let’s give thanks for all of the good things in our lives.  If you have any of the following, you’re lucky indeed and should feel grateful:  Loving family, good health, caring friends, cheerful acquaintances, some degree of success in your profession or work of any kind, and achievement of (or progress toward) whatever goals you may have.

Next, if you’re financially able to assist a good cause (or many), this is a splendid time of year to send them gifts.  For most charitable and other good causes, a monetary gift in almost any amount is welcome.  So please think about opening your wallet, your checkbook, or your online payment account, and make a gift to show that you support these groups.

You can also scour your home and donate usable items you no longer need to worthy groups that will pass them on to others.

Finally, giving presents to those you love and care about may also be important to you.  But try to keep the health of our planet in mind when you choose those gifts.

Good karma will come to you.  Or so I like to think.

HAPPY HOLIDAYS!


 

My Life as a Shopper

I have a new outlook on shopping.  I’m no longer shopping the way I used to.

Why?

I’ll start at the beginning.  My long history of shopping began when I was very young.

My parents were both immersed in retailing.  My mother’s parents immigrated to Chicago from Eastern Europe and, soon after arriving, opened a clothing store on Milwaukee Avenue.  Their enterprise evolved into a modest chain of women’s apparel stores, and throughout her life my mother was intimately involved in the business.  She instilled in me the ethos that shopping for new things, especially clothes, was a good thing.  Under her influence, I gave away countless wearable items of clothing in favor of getting something new, preferably something sold in one of her family’s stores.  (I later regretted parting with some of the perfectly good items I could have continued to wear for many more years.)

Even though my father received a degree in pharmacy from the University of Illinois and enjoyed some aspects of his work as a pharmacist, he was himself attracted to retailing.  At a young age, he opened his own drugstore on the South Side of Chicago (I treasure a black-and-white photo of him standing in front of his store’s window).  After marrying my mother, he spent a number of years working in her family’s business, and in the late ‘40s the two of them opened a women’s clothing boutique on Rush Street, a short distance from Oak Street, in a soon-to-be-trendy shopping area.  Ahead of its time, the boutique quickly folded, but Daddy never lost his taste for retailing.

In view of this history, I was fated to become a “shopper.”  After Daddy died when I was 12, our family wasn’t able to spend big wads of money on anything, including clothes.  But my mother’s inclination to buy new clothes never really ceased.

Thanks to generous scholarship and fellowship awards, I made my way through college and grad school on a minuscule budget.  I saved money by spending almost nothing, savoring the 99-cent dinner at Harkness Commons almost every night during law school.  And because I began my legal career with a $6,000 annual salary as a federal judge’s law clerk and, as a lawyer, never pursued a high-paying job (I preferred to work on behalf of the poor, for example), I got by without big-time shopping.

Marriage brought little change at first.  My darling new husband also came from a modest background and was not a big spender, even when our salaries began to move up a bit.

But things eventually changed.  Higher salaries and the arrival of new retail chain stores featuring bargain prices made buying stuff much more tempting.  I needed presentable clothes for my new full-time jobs.  Our daughters needed to be garbed in clothes like those the other kids wore.  Our living room chairs from Sears began to look shabby, propelling us toward somewhat better home décor.

A raft of other changes led me to spend more time shopping.  My boring law-firm jobs were more tolerable if I could escape during my lunch hour and browse at nearby stores.  The rise of outlet malls made bargain shopping easier than ever.  And travels to new cities and countries inspired buying small, easily packable items, like books and jewelry.

After I moved to San Francisco, having jettisoned possessions I’d lived with for years in my former home, I needed to acquire new ones.  So there I was, buying furniture and kitchen equipment for my sunny new apartment.

At the same time, our consumption-driven culture continued to push us to buy more and more, including the newly emerging “fast fashion,” which offered stylish clothes at temptingly low prices.

But this emphasis on acquiring new stuff, even low-priced stuff, has finally lost its appeal.

I’ve come to realize that I don’t need it.

My overall goal is to simplify my life.  This means giving away a lot of things I don’t need, like stacks of books I’ll never read and charming bric-a-brac that’s sitting on a shelf collecting dust.  Like clothes that a disadvantaged person needs more than I do.

My new focus:  First, use what I already have.  Next, do not buy anything new unless I absolutely need it.

Choosing not to acquire new clothes—in essence, reusing what I already have, adopting the slogan “shop your closet”–is a perfect example of my new outlook.

I’ve previously written about confining one’s new purchases to “reunion-worthy” clothes.  [Please see my blog post of October 12, 2017, advising readers to choose their purchases carefully, making sure that any clothes they buy are flattering enough to wear at a school reunion.]

But that doesn’t go far enough.  New purchases should be necessary.

I find that I’m not alone in adopting this approach.

Many millennials have eschewed buying consumer goods, opting for new experiences instead of new material things.  I guess I agree with the millennials’ outlook on this subject.

Here’s other evidence of this approach.  An article in The Guardian in July 2019 shouted “’Don’t feed the monster!’ The people who have stopped buying new clothes.”  Writer Paula Cocozza noted the growing number of people who love clothes but resist buying new ones because of their lack of sustainability:  Many consumers she interviewed had switched to second-hand shopping so they would not perpetuate this cycle of consumption and waste.

Second-hand shopping has even taken off online.  In September, the San Francisco Chronicle noted the “wave of new resale apps and marketplaces” adding to longtime resale giants like eBay.  At the same time, The New York Times, covering Fashion Week in Milan, wrote that there was “a lot of talk about sustainability over the last two weeks of collections, and about fashion’s role in the climate crisis.”  The Times added:  “the idea of creating clothes that last—that people want to buy and actually keep, keep wearing and never throw out, recycle or resell”—had become an important part of that subject.  It quoted Miuccia Prada, doyenne of the high-end clothing firm Prada:  “we need to do less.  There is too much fashion, too much clothes, too much of everything.”

Enter Tatiana Schlossberg and her new book, Inconspicuous Consumption: The Environmental Impact You Don’t Know You Have (2019).  In the middle of an absorbing chapter titled “Fashion,” she notes that “There’s something appealing about being able to buy really cheap, fashionable clothing […] but it has given us a false sense of inexpensiveness.  It’s not only that the clothes are cheap; it’s that no one is paying for the long-term costs of the waste we create just from buying as much as we can afford….”

Some scholars have specifically focused on this issue, the “overabundance of fast fashion—readily available, inexpensively made new clothing,” because it has created “an environmental and social justice crisis.”  Christine Ekenga, an assistant professor at Washington University in St. Louis, has co-authored a paper focused on the “global environmental injustice of fast fashion,” asserting that the fast-fashion supply chain has created a dilemma.  While consumers can buy more clothes for less, those who work in or live near textile-manufacturing facilities bear a disproportionate burden of environmental health hazards.  Further, millions of tons of textile waste sit in landfills and other settings, hurting low-income countries that produce many of these clothes.  In the U.S., about 85 percent of the clothing Americans consume–nearly 80 pounds per American per year–is sent to landfills as solid waste.  [See “The Global Environmental Injustice of Fast Fashion” in the journal Environmental Health.]

A high-profile public figure had an epiphany along the same lines that should influence all of us.  The late Doug Tompkins was one of the founders of The North Face and later moved on to help establish the apparel chain Esprit.  At the height of Esprit’s success, he sold his stake in the company for about $150 million and moved to Chile, where he embraced a whole new outlook on life and adopted an important new emphasis on ecology.  He bought up properties for conservation purposes, in this way “paying my rent for living on the planet.”  Most tellingly, he said, “I left that world of making stuff that nobody really needed because I realized that all of this needless overconsumption is one of the driving forces of the [environmental] crisis, the mother of all crises.”  [Sierra magazine, September/October 2019.]

Author Marie Kondo fits in here.  She has earned fame as a de-cluttering expert, helping people who feel overwhelmed with too much stuff to tidy up their homes.  Her focus is on reducing clutter that’s already there, so she doesn’t zero in on new purchases.  But I applaud her overall outlook.  As part of de-cluttering, she advises:  As you consider keeping or letting go of an item, hold it in your hands and ask:  “Does this item bring me joy?”  This concept of ensuring that an item brings you joy could apply to new purchases as well, so long as the item bringing you joy is also one you really need.

What should those of us enmeshed in our consumer culture do?  In The Wall Street Journal in July 2019, April Lane Benson, a “shopping-addiction-focused psychologist and the author of ‘To Buy or Not to Buy:  Why We Overshop and How to Stop’,” suggested that if a consumer is contemplating a purchase, she should ask herself six simple questions:  “Why am I here? How do I feel? Do I need this? What if I wait? How will I pay for it? Where will I put it?”

Benson’s list of questions is a good one.  Answering them could go a long way toward helping someone avoid making a compulsive purchase.  But let’s remember:  Benson is talking about a shopper already in a store, considering whether to buy something she’s already selected in her search for something new.  How many shoppers will interrupt a shopping trip like that to answer Benson’s questions?

I suggest a much more ambitious scheme:  Simply resolve not to buy anything you don’t need!

My 11-year-old granddaughter has the right idea:  She’s a minimalist who has rejected any number of gifts from me, including some fetching new clothes, telling me she doesn’t need them.

When I reflect on my life as a shopper, I now understand why and how I became the shopper I did.  Perhaps, in light of my family history and the increasingly consumption-driven culture I’ve lived through, I didn’t really have an option.

But I have regrets:  I’ve wasted countless hours browsing in stores, looking through racks and poring over shelves for things to buy, much of which I didn’t need, then spending additional hours returning some of the things I had just purchased.

These are hours I could have spent far more wisely.  Pursuing my creative work, exercising more often and more vigorously, doing more to help those in need.

Readers:  Please don’t make the mistakes I have.  Adopt my new philosophy.  You’ll have many more hours in your life to pursue far more rewarding goals than acquiring consumer goods you don’t really need.

 

 

 

The Old Man and the Movies

The Sundance Kid rides again!  Not on horseback but in a 1970s sedan.

In his most recent film (and perhaps his last), The Old Man and the Gun, Robert Redford plays a charming real-life bank robber.  Announcing his retirement from acting, he told Ruthe Stein of the San Francisco Chronicle that he chose the part because he identified with the bank robber’s rebellious spirit, and he wanted his last film to be “quirky and upbeat and fun.”

I have a special fondness for Redford that goes back to his role in his first memorable film, Butch Cassidy and the Sundance Kid.  Redford has called it the “first real film experience I ever had” and “the most fun on any film I’ve had.  It changed my life.”

When I saw the film in Chicago shortly after its release, I was struck by the performances of both Paul Newman (my perennial favorite) as Butch Cassidy and newcomer Redford as the Sundance Kid.

Unbeknown to me, there was a real live double of the Sundance Kid out there, waiting to meet me when I moved to LA a short time later:  my soon-to-be husband.  Once he added a mustache to his otherwise great looks, his resemblance to Redford in that film was uncanny, and I dubbed him the Sundance Kid.  I even acquired a poster of Redford in that role to affix to my office wall as a reminder of my new-found love.

The 1969 film, now fifty years old, holds up very well.  In perhaps its most memorable scene, the two leading men plunge from a cliff into roiling waters below, shouting a now more commonly accepted expletive for probably the first time in movie history.

Newman and Redford play leaders of the “Hole in the Wall Gang,” a group that robs banks, successfully for the most part, until robbing a train gets them into serious trouble.  They alienate Mr. E. H. Harriman of the Union Pacific Railroad, who hires special trackers to relentlessly pursue Butch and Sundance.

An endearing scene takes place when the two men approach the home of Etta Place, Sundance’s girlfriend.  News stories have alarmed Etta.  “The papers said they had you.  They said you were dead.”  Sundance’s first reaction:  “Don’t make a big thing of it.”  He pauses and reflects.  Then he says, “No.  Make a big thing of it.”  And they enthusiastically embrace.

Redford’s brilliant career includes a large number of notable Hollywood films.  It’s easy for me to name some favorites:  Downhill Racer in 1969, The Candidate in 1972, The Way We Were and The Sting in 1973, All the President’s Men in 1976, The Natural in 1984, and Out of Africa in 1985.  (A few of these especially resonate with me.)  And in All is Lost, as recently as 2013, Redford shines as an older man on the verge of dying alone in troubled ocean waters.  Outstanding performances, each and every one.

In recent years, as I became an active supporter of NRDC (the Natural Resources Defense Council), an entity vigorously working on behalf of the environment, I began hearing from Redford, who aligned himself with NRDC’s goals and requested additional donations.  I commend him for his strong support for protecting the future of our country and our planet.  His efforts on behalf of the environment seem even more critical now, as we face increasingly dire problems caused by climate change.

As for Redford’s movie career, my hope is that he chooses not to retire.  Most movie-goers would welcome seeing new films that include him, even in a small role.  In the meantime, I encourage every film buff to see The Old Man and the Gun.  Featuring a number of brief scenes from his earlier movies (plugged into the movie by director David Lowery), the film is a great reminder of a storied Hollywood career.  A career that began with the Sundance Kid.

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

One summer or two, I didn’t totally escape the sun and ended up with a couple of bad sunburns.  Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burns.  But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid sunscreens harboring suspect chemicals, I use only those whose active ingredients include—guess what—zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both are inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

The Last Straw(s)

A crusade against plastic drinking straws?  Huh?

At first glance, it may strike you as frivolous.  But it’s not.  In fact, it’s pretty darned serious.

In California, the city of Berkeley may kick off such a crusade.   In June, the city council directed its staff to research what would be California’s first city ordinance prohibiting the use of plastic drinking straws in bars, restaurants, and coffee shops.

Berkeley is responding to efforts by nonprofit groups like the Surfrider Foundation that want to eliminate a significant source of pollution in our oceans, lakes, and other bodies of water. According to the conservation group Save the Bay, the annual cleanup days held on California beaches have found that plastic straws and stirrers are the sixth most common kind of litter.  If they’re on our beaches, they’re flowing into the San Francisco Bay, into the Pacific Ocean, and ultimately into oceans all over the world.

As City Councilwoman Sophie Hahn, a co-author of the proposal to study the ban, has noted, “They are not biodegradable, and there are alternatives.”

I’ve been told that plastic straws aren’t recyclable, either.  So whenever I find myself using a plastic straw to slurp my drink, I conscientiously separate my waste:  my can of Coke Zero goes into the recycling bin; my plastic straw goes into the landfill bin.  This is nuts.  Banning plastic straws in favor of paper ones is the answer.

Realistically, it may be a tough fight to ban plastic straws because business interests (like the Monster Straw Co. in Laguna Beach) want to keep making and selling them.  Business owners claim that plastic straws are more cost-effective and that customers prefer them.  As Monster’s founder and owner, Natalie Buketov, told the SF Chronicle, “right now the public wants cheap plastic straws.”

Berkeley could vote on a ban by early 2018.

On the restaurant front, some chefs would like to see the end of plastic straws.  Spearheading a growing movement to steer eateries away from serving straws is Marcel Vigneron, owner-chef of Wolf Restaurant on Melrose Avenue in L.A.  Vigneron, who’s appeared on TV’s “Top Chef” and “Iron Chef,” is also an enthusiastic surfer, and he’s seen the impact of straw-pollution on the beaches and marine wildlife.  He likes the moniker “Straws Suck” to promote his effort to move away from straws, especially the play on words:  “You actually use straws to suck, and they suck because they pollute the oceans,” he told CBS in July.

Vigneron added that if a customer wants a straw, his restaurant has them.  But servers ask customers whether they want a straw instead of automatically putting one into their drinks.  He notes that every day, 500 million straws are used in the U.S., enough to “fill up 127 school buses.”  He wants to change all that.

Drinking straws have a long history.  Their origins were apparently actual straw, or other straw-like grasses and plants.  The first paper straw, made from paper coated with paraffin wax, was patented in 1888 by Marvin Stone, who didn’t like the flavor of a rye grass straw added to his mint julep.  The “bendy” paper straw was patented in 1937.  But the plastic straw took off, along with many other plastic innovations, in the 1960s, and nowadays plastic straws are difficult to avoid.

Campaigns like Surfrider’s have taken off because of mounting concern with plastic pollution.  Surfrider, which has also campaigned against other threats to our oceans, like plastic bags and cigarette butts, supports the “Straws Suck” effort, and according to author David Suzuki, Bacardi has joined with Surfrider in the movement to ban plastic straws.

Our neighbors to the north have already leaped ahead of California.  The town of Tofino in British Columbia claims that it mounted the very first “Straws Suck” campaign in 2016.  By Earth Day in April that year, almost every local business had banned plastic straws.  A fascinating story describing this effort appeared in the Vancouver Sun on April 22, 2016.

All of us in the U.S., indeed the world, need to pay attention to what plastic is doing to our environment.  “At the current rate, we are really headed toward a plastic planet,” warned Roland Geyer, an industrial ecologist at UC Santa Barbara and author of a study in the journal Science Advances, as reported by AP in July.  Geyer noted that there’s enough discarded plastic to bury Manhattan under more than 2 miles of trash.

Geyer used the plastics industry’s own data to show that the amount of plastic made and thrown out is accelerating.  In 2015, the world produced more than twice as much plastic as it did in 1998.

The plastics industry has fought back, relying on the standard of cost-effectiveness.  It claims that alternatives to plastic, like glass, paper, or aluminum, would require more energy to produce.  But even if that’s true, the energy difference in the case of items like drinking straws would probably be minimal.  If we substitute paper straws for plastic ones, the cost difference would likely be negligible, while the difference for our environment—eliminating all those plastic straws floating around in our waterways—could be significant.

Aside from city bans and eco-conscious restaurateurs, we need to challenge entities like Starbucks.  The mega-coffee-company and coffeehouse-chain prominently offers, even flaunts, brightly-colored plastic straws for customers sipping its cold drinks.  What’s worse, it happily sells them to others!  Just check out the Starbucks straws for sale on Amazon.com.  Knowing what we know about plastic pollution, I think Starbucks’s choice to further pollute our environment by selling its plastic straws on the Internet is unforgivable.

At the end of the day, isn’t this really the last straw?