
Waiting for a Vaccine

 

While the world, in the midst of a deadly pandemic, turns to science and medicine to find a vaccine that would make us all safe, I can’t help remembering a long-ago time in my life when the world faced another deadly disease.

And I vividly remember how a vaccine, the result of years of dedicated research, led to the triumphant defeat of that disease.

Covid-19 poses a special threat.  The U.S. has just surpassed one million cases, according to The Washington Post.  It’s a new and unknown virus that has baffled medical researchers, and those of us who wake up every day feeling OK are left wondering whether we’re asymptomatic carriers of the virus or just damned lucky.  So far.

Testing of the entire population is essential, as is the development of effective therapies for treating those who are diagnosed as positive.  But our ultimate salvation will come with the development of a vaccine.

Overwhelming everything else right now is an oppressive feeling of fear.  Fear that the slightest contact with the virus can cause a horrible assault on one’s body, possibly leading to a gruesome hospitalization and, finally, death.

I recognize that feeling of fear.  Anyone growing up in America in the late 1940s and the early 1950s will recognize it.

Those of us who were conscious at that time remember the scourge of polio.  Some may have memories of that time that are as vivid as mine.  Others may have suppressed the ugly memories associated with the fear of polio.  And although the fear caused by Covid-19 today is infinitely worse, the fear of polio was in many ways the same.

People had been aware of the disease called polio, the common name for poliomyelitis (originally and mistakenly called infantile paralysis; it didn’t affect only the young), for a long time.  It was noted as early as the 19th century, and in 1908 two scientists identified a virus as its cause.

Before polio vaccines were available, outbreaks in the U.S. caused more than 15,000 cases of paralysis every year.  In the late 1940s, these outbreaks increased in frequency and size, resulting in an average of 35,000 victims of paralysis each year.  Parents feared letting their children go outside, especially in the summer, when the virus seemed to peak, and some public health officials imposed quarantines.

Polio appeared in several different forms.  About 95% of the cases were asymptomatic.  Others were mild, causing ordinary virus-like symptoms, and most people recovered quickly.  But some victims contracted a more serious form of the disease.  They suffered temporary or permanent paralysis and even death.  Many survivors were disabled for life, and they became a visible reminder of the enormous toll polio took on children’s lives.

The polio virus is highly infectious, spreading through contact between people, generally entering the body through the mouth.  A cure for it has never been found, so the ultimate goal has always been prevention via a vaccine.  Thanks to the vaccine first developed in the 1950s by Jonas Salk, polio was eventually eliminated from the Western Hemisphere in 1994.  It continues to circulate in a few countries elsewhere in the world, where vaccination programs aim to eliminate these last pockets because there is always a risk that it can spread within non-vaccinated populations.

[When HIV-AIDS first appeared, it created the same sort of fear:  a new disease with an unknown cause.  There is still no vaccine, although research efforts continue.  Notably, Jonas Salk spent the last years of his life searching for a vaccine against AIDS.  Until one exists, the development of life-saving drugs has lessened fear of the disease.]

When I was growing up, polio was an omnipresent and very scary disease.  Every year, children and their parents received warnings from public health officials, especially in the summer.  We were warned against going to communal swimming pools and large gatherings where the virus might spread.

We saw images on TV of polio’s unlucky victims.  Even though TV images back then were in black and white, they were clear enough to show kids my age who were suddenly trapped inside a huge piece of machinery called an iron lung, watched over by nurses who attended to their basic needs while they struggled to breathe.  Then there were the images of young people valiantly trying to walk on crutches, as well as those confined to wheelchairs.  They were the lucky ones.  Because we knew that the disease also killed a lot of people.

So every summer, I worried about catching polio, and when colder weather returned each fall, I was grateful that I had survived one more summer without catching it.

I was too young to remember President Franklin D. Roosevelt, but I later learned that he had contracted polio in 1921 at the age of 39.  He had a serious case, causing paralysis, and although he was open about having had polio, he has been criticized for concealing how extensive his disability really was.

Roosevelt founded the National Foundation for Infantile Paralysis, which soon became known as the March of Dimes.  The catchphrase “march of dimes” was coined by the popular actor/comedian/singer Eddie Cantor, who worked vigorously on the campaign to raise funds for research.  Playing on the name of the well-known newsreel The March of Time, Cantor announced on a 1938 radio program that the March of Dimes would begin collecting dimes to support research into polio, as well as to help victims who survived the disease.  (Because polio ultimately succumbed to a vaccine, the March of Dimes has evolved into an ongoing charity focused on the health of mothers and babies, specifically on preventing birth defects.)

Yes, polio was defeated by a vaccine.  For years, the March of Dimes funded medical research aimed at a vaccine, and one of the recipients of its funds was a young physician at the University of Pittsburgh School of Medicine named Jonas Salk.

Salk became a superhero when he announced on April 12, 1955, that his research had led to the creation of a vaccine that was “safe, effective, and potent.”

Salk had worked toward the goal of a vaccine for years, especially after 1947, when he was recruited to be the director of the school’s Virus Research Laboratory.  There he created a vaccine composed of “killed” polio virus.  He first administered it to volunteers who included himself, his wife, and their children.  All of them developed anti-polio antibodies and experienced no negative reactions to the vaccine.  Then, in 1954, a massive field trial tested the vaccine on over one million children between the ages of six and nine, allowing Salk to make his astonishing announcement in 1955.

I remember the day I first learned about the Salk vaccine. It was earthshaking.  It changed everything.  It represented a tremendous scientific breakthrough that, over time, relieved the anxiety of millions of American children and their parents.

But it wasn’t immediately available.  It took about two years before enough of the vaccine was produced to make it available to everyone, and the number of polio cases during those two years averaged 45,000.

Because we couldn’t get injections of the vaccine for some time, the fear of polio lingered.  Before I could get my own injection, I recall sitting in my school gym one day, looking around at the other students, and wondering whether I might still catch it from one of them.

My reaction was eerily like the question John Kerry posed when he testified before a Senate committee in 1971:  “How do you ask a man to be the last man to die in Vietnam?”  I remember thinking how terrible it would be to be one of the last kids to catch polio when the vaccine already existed but I hadn’t been able to get it yet.

I eventually got my injection, and life changed irreversibly.  Never again would I live in fear of contracting polio.

In 1962, the Salk vaccine was replaced by Dr. Albert Sabin’s live attenuated vaccine, an oral vaccine that was both easier to administer and less expensive, and I soon received that one as well.

(By the way, neither Salk nor Sabin patented their discoveries or earned any profits from them, preferring that their vaccines be made widely available at a low price rather than exploited by commercial entities like pharmaceutical companies.)

Today, confronting the Covid-19 virus, no thinking person can avoid the fear of becoming one of its victims.  But as scientists and medical doctors continue to search for a vaccine, I’m reminded of how long those of us who were children in the 1950s waited for the polio vaccine.

Because the whole world is confronting this new and terrible virus, valiant efforts, much like those of Jonas Salk, are aimed at creating a “safe, effective and potent” vaccine.  And there are encouraging signs coming from different directions.  Scientists at Oxford University in the UK were already working on a vaccine against another coronavirus when Covid-19 reared its ugly head, and they have pivoted toward developing a possible vaccine to defeat the new threat.  Clinical trials may take place within the next few months.

Similarly, some Harvard researchers haven’t taken a day off since early January, working hard to develop a vaccine.  Along with the Center for Virology and Vaccine Research at the Beth Israel Deaconess Medical Center, this group plans to launch clinical trials in the fall.

While the world waits, let’s hope that a life-saving vaccine will appear much more quickly than the polio vaccine did.  With today’s improved technology, and a by-now long and successful history of creating vaccines to defeat deadly viruses, maybe we can reach that goal very soon.  Only then, when we are all able to receive the benefits of an effective vaccine, will our lives truly begin to return to anything resembling “normal.”

Hats Off to…Hats!

 

I grew up in the midst of a hat-wearing era.  If you watch movies from the 1950s, you’ll see what I mean.  In both newsreels and Hollywood films, almost all of the grown-ups, in nearly every walk of life, are wearing hats.

Of course, grown-ups occasionally doffed their hats.  On a vacation, at a beach, in a theater.  But when it really counted, and they wanted to be taken seriously, they wore hats.

Although factory and construction workers wore other kinds of hats at their jobs, white-collar men tended to wear fedoras.  Footage of men attending baseball games makes clear that, even at casual events, most men wore felt fedoras.

Women tended to opt for a variety of stylish hats, many of which look pretty silly today.  Just take a look at photos of Eleanor Roosevelt.  As the wife and later widow of President Franklin D. Roosevelt, she’s frequently seen in headwear that was not only frilly but also far from flattering. (By contrast, photos of her younger self, sans hat, put her in a far more appealing light.)  Images of other women in frilly hats predominate in the photos of the time.

When did things begin to change?  Probably about the time that Senator John F. Kennedy became a popular media focus.  He was almost never photographed wearing a hat.  It wasn’t until his inauguration in January 1961, when he wore a top hat just like Ike’s, that he appeared in a formal grown-up’s hat.  (He notably doffed it when he gave his memorable speech.)

The popular TV series “Mad Men,” which aired from 2007 to 2015, illustrates this change.  When the series begins in March 1960, Don Draper wears a stylish fedora whenever he leaves the office.  But as the series moves through the ‘60s, he abandons his hat more and more.

The hat-wearing era clearly ended years ago.  Today a celebrity or fashion icon may occasionally be photographed in a trendy hat, but hats are no longer de rigueur.

I’ve never adopted the habit of wearing hats, with two major exceptions:  I wear warm fuzzy ones to cover my ears on chilly days, and I wear big-brimmed ones to shield my face from the sun.

But two years ago, the de Young Museum in San Francisco put together a brilliant exhibit highlighting the creation and wearing of women’s hats.  “Degas, Impressionism, and the Paris Millinery Trade” focused on the creative artists who worked as milliners in Paris during Degas’s era, as well as on the era’s hats themselves.

The Wall Street Journal described the exhibit as “groundbreaking,” an exhibit that revealed “a compelling and until now less widely known side” of the Impressionist painter Edgar Degas.

The exhibit brought together exquisite Degas paintings and exquisite French-made hats.  Paris, as the center of the fashion industry during Degas’s era, was also the center of the millinery world.  Around one thousand Parisian milliners created a rich and diverse array of hats.  Many of these milliners worked in a network of independent millinery shops that competed with the nearby grand department stores.

Hat-making, the display and sale of hats, and the wearing of hats in belle époque Paris—all of these fascinated the Impressionist painters who focused on urban life in the City of Light.  Degas had a particular affinity for millinery, and he would often return to the subject—featuring both the creators, who ranged from prestigious designers to the “errand girls” who delivered hats to their new owners, and the elite consumers of these hats.  This exhibit was the first to display all of his millinery paintings in one place.

The exhibit also included display cases filled with French-made hats from the period, presented as sculptural art objects in their own right.  This headwear came from institutions that collect hats as part of their costume collections; the Chicago History Museum and the Fine Arts Museums of San Francisco, among others, contributed wonderful examples from this fabulous era of women’s decorative headwear.

When I saw this exhibit, I was thrilled by it.  It also revived a childhood memory I’d nearly forgotten.  Standing in front of Degas’s paintings of milliners, I suddenly remembered going to a millinery shop in downtown Chicago with my mother when I was about 8 or 10.  My mother never had the means to become an affluent consumer of fashion, but she was acutely aware of fashion trends, and she carved out a way to dress as stylishly as my parents’ limited resources allowed.

On this occasion, Mom must have felt financially secure enough to travel downtown and purchase a new hat styled just for her.  I was her lucky companion that day, creating a vivid memory of our shopping trip.

We found the millinery shop somewhere in a building on Randolph Street, a block or two west of the gigantic Marshall Field’s store on State Street.  We rode an elevator to a floor above ground level and stepped out into the cheerful shop, its big windows letting in a great deal of natural light.  Mom sat in a chair facing a mirror while the milliner offered her several different styles to choose from.

Mom chose a white straw hat with blue flowers.  It was a delightful style that suited her perfectly.  Today I’d describe it as a cross between a cloche and a very small sunhat:  a straw cloche with a brim.  Not the kind of cloche that fits closely around the face, but one with a small brim that framed Mom’s face and set it off in a charming way.  Mom and the milliner conferred, possibly even turned to me for my opinion, and settled on that hat, adding the lovely blue flowers in exactly the right place.

Mom clearly felt pretty when she wore that hat.  She went on to wear it many times, and whenever she did, I was always happy that I’d been with her on the day she chose it.  Even though Mom couldn’t purchase an elegant French-designed hat like those featured at art museums, she had her very own millinery-shop hat designed just for her.

She treasured that hat.  So did I.

 

 

The Demise of the Flip Chair

It’s gone.  The not-so-badly worn, crumbs-in-its-cracks, cocoa-brown chair, faded in spots by the sun.  Our venerable flip chair is gone.

The flip chair followed us from the day I first found it on the spiffy North Shore of Chicago to a student’s studio apartment in DC.  Later it moved to three different apartments in Cambridge, Mass., and finally to a charming one-bedroom in San Francisco.

And now it’s finally gone.

The chair served us well.  I discovered it at an estate sale in a posh section of Winnetka, Illinois, inside a grand house on a private road near the lake.  It was in perfect condition, and I thought it would be useful as an extra chair, just right for my daughters’ sleepover guests because it could flip out from its chair-like position into a bed.  A single-size bed that would turn out to be quite comfy.

One of my daughters first used it when her friend Katie stayed overnight and slept on the flipped-out chair.  Katie was a nice young girl, but she wasn’t the sharpest knife in the drawer.  After she went home, we found she’d left behind a copy of Teen Beat magazine.  My daughters, who didn’t relate to Teen Beat’s focus on vapid teenage idols, leafed through it, and none of us could help laughing when we saw that Katie had underlined certain stories.  Underlining stories in Teen Beat?  Our scoffing reaction was probably unkind, but we made sure that Katie never knew.  I think we called and offered to return her magazine, but I don’t think she took us up on it.

Other young friends slept on the chair once in a while, so we held onto it, figuring it might continue to be useful.  It finally justified its existence years later, when my younger daughter (I’ll call her Laurie) left to study law at Georgetown in DC.  We rented an SUV, stuffed it with her possessions, and stuck the flip chair into the mix.  When we arrived, it happily fit into the studio apartment she rented in Dupont Circle, and I slept on it myself a couple of times.  It was comfy indeed.

After law school, Laurie began work as a law clerk for a judge in Boston and rented an apartment in Cambridge.  The flip chair joined her there, and it went on to reside in two other apartments in Cambridge before Laurie moved to a one-bedroom in San Francisco.  There, placed next to a window in her living room, the chair basked in the California sun, its color fading.

I sat on it occasionally, but it wasn’t a great chair for sitting.  We clung to it, thinking it might serve once again as an extra bed for visitors.  But things changed dramatically about a year ago when Laurie’s new baby arrived on the scene.  The flip chair stayed in its place by the window, continuing to fade, while no one ever used it as a bed.

As the year went along, it became clear that Laurie needed to make room for some essential things for her baby.  Some of the old stuff had to go.  First went two skinny chairs and a dented metal wardrobe, then a creaky IKEA chest of drawers and an unwieldy suitcase, all set outside for takers passing by her apartment building.  And finally, the bell tolled for the flip chair.

Two days ago, Laurie shoved the flip chair into her elevator and carried it to the sidewalk outside her building, where a lucky scavenger could seize it and get a few more years out of it.  In its place is a large play yard for the baby, filled with a heap of his books and toys.  Clearly a much better use of the space where the flip chair once sat.

And so we said goodbye to the valued but largely ignored flip chair.  It won’t be missed, but it will be remembered as a quasi-member of the family, one whose tenure in our homes had finally come to an end.

Remembering Stuff

Are you able to remember stuff pretty well?  If you learned that stuff quickly, you have a very good chance of retaining it.  Even if you spent less time studying it than you might have.

These conclusions arise from a new study by psychologists at Washington University in St. Louis.  According to its lead author, Christopher L. Zerr, “Quicker learning appears to be more durable learning.”

The study, published in the journal Psychological Science, tried a different way to gauge differences in how quickly and how well people learn and retain information.  Using pairs that matched English words with words in Lithuanian, a difficult-to-learn language, the researchers created a “learning-efficiency score” for each participant.

“In each case, initial learning speed proved to be a strong predictor of long-term retention,” said senior author Kathleen B. McDermott, professor of psychological and brain sciences at Washington University.

Forty-six of the participants returned for a follow-up study three years later, and its results confirmed the earlier findings.

What explains this outcome?  The researchers suggest two possibilities.

First, individuals may differ because those with better attention control can focus more effectively while learning material, avoiding distraction and the forgetting it causes.  Another explanation:  efficient learners use more effective learning strategies, such as using a keyword to relate the two words in a pair.

The researchers don’t think their job is done.  Instead, they’d like to see future research on learning efficiency that would have an impact in educational and clinical settings.

The goal is to be able to teach students how to be efficient learners, and to forestall the effects of disease, aging, and neuropsychological disorders on learning and retention.

Conclusion:  If you’ve always been a quick learner, that’s probably stood you in good stead, enabling you to remember stuff you learned quickly in the first place.

 

[This blog post is not the one I originally intended to write this month, when I planned to focus on how important it is to vote in the midterm elections in November.  Publishing my new novel, RED DIANA, this month has kept me from writing that post, but I hope to publish it at some point.  It would be something of a reprise of a post I published in September 2014, “What Women Need to Do.”]