
Is It Time to Resurrect the “Housedress”?

The HBO miniseries “The Plot Against America,” which appeared earlier this year, focused on life in America in the early 1940s.  Adapted from the 2004 novel by Philip Roth, the series told a terrifying story, highlighting the possibility that a fascist anti-Semitic regime could assume control over politics in our country.

New York Times critic A.O. Scott, describing HBO’s adaptation as “mostly faithful” to the novel, observed that the world it portrayed looked familiar, yet different, to us today.  He noted in particular “the clothes” worn by the people inhabiting that world, as well as the cars, the cigarettes, and what he called “the household arrangements,” evoking a period “encrusted with…nostalgia.”

The series was, in my view, a stunning depiction of that era, along with a chilling prediction of what might have happened.  Thankfully, Roth’s fictional prediction never came true, and I hope it never will.

One thing I took away from the series was how authentically it created the images from that time.  I was born years later than both Philip Roth and his character, the 8-year-old Philip.  But I can recall images from the 1950s, and I’ve seen countless films dating from the 1940s and 1950s, as well as TV shows like “I Love Lucy.”

A couple of things in the series stand out.  First, people got their news from newspapers and the radio.  The leading characters appear in a number of scenes reading the daily newspapers that influenced their view of the world.  They also listened attentively to the radio for news and other information.  The radio broadcaster Walter Winchell even plays an important part in the story.

The other thing that stands out is the clothing worn by the characters in “Plot.”  Especially the women characters.  These women tended to have two types of wardrobes.  One represented the clothing they wore at home, where they generally focused on housecleaning, cooking, and tending to their children.  The other represented what they would wear when they left home, entering the outside world for a variety of reasons.

The wardrobe worn at home looked extremely familiar.  My mother clung to that wardrobe for decades.  She, like the women in “Plot,” wore housedresses at home.  These were cotton dresses, usually in a floral or other subdued print, that were either buttoned or wrapped around the body in some fashion.  In an era before pants became acceptable for women (Katharine Hepburn being a notable exception), women wore dresses or skirts, even to do housework at home.

Only when they left home, to go to somewhere like an office or a bank, did they garb themselves in other clothes.  In this wardrobe, they tended to wear stylish dresses made with non-cotton fabrics, or skirt suits with blouses, along with hats and white gloves. Working women employed in office-type settings (there were a few, like the character brilliantly played by Winona Ryder in “Plot”) wore these clothes to work every day. (Women employed in other settings of course wore clothes appropriate to their workplaces.)

Now, with most of us staying home nearly all the time, I wonder:  Is it time to resurrect the housedress?

Here are some reasons why it might be:

  1. Warmer weather is approaching, or may have already arrived, depending on where you live.
  2. Heavy clothing like sweatshirts and sweatpants, which many of us have been relying on during our self-isolation at home, will become impractical because it will be uncomfortably hot.
  3. Pajamas and nightgowns aren’t a good idea for all-day wear.  We should save them for bedtime, when we need to separate our daytime experience from the need to get some sleep.
  4. The housedress offers an inviting choice for women who want to stay comfortably at home, wearing cool cotton (or cotton-blend) dresses that allow them to move as comfortably as they do in sweat clothes, all day long.

I concede that comfortable shorts and t-shirts might fit the bill, for men as well as women.  But I suggest that women consider an alternative.  They may want to give housedresses a try.

Ideally, a woman will be able to choose from a wide range of cheerful fabric designs and colors.  If she can track down one that appeals to her, she just might be convinced by its comfort and then tempted to wear more of them.

I’ve already adopted my own version of the housedress.  I rummaged through one of my closets and found a few items I haven’t worn in years.  I’ve always called them “robes,” although they’ve also been called housecoats or other names.  My mother for some reason liked to call them “dusters.”  My husband’s aunt liked to wear what she called “snap coats.”

But in the big picture, we’re really talking about the same thing.  Cotton robes/dresses in a variety of designs and prints. Today they’re usually fastened with snaps.  Easy in, easy out.

And most of them have pockets!  (As I’ve written before, all women’s clothes should have pockets.)  [Please see my blog post “Pockets!” https://susanjustwrites.wordpress.com/2018/01/ ]

I plucked a couple of these out of my closet, some with the brand name Models Coats.  I had never even worn one of them.  (A tag was still attached, featuring the silly slogan, “If it’s not Models Coat…it’s not!”)  But I’ll wear it now.

By the way, I’ve searched for “Models Coats” on the internet, and an amazing variety of “housedresses,” or whatever you choose to call them (Models Coats and other brands), is offered online.  So it appears that some women have been purchasing them all along.

Now here’s a bit of cultural history:  My mother kept her 1950s-style housedresses well into the 1990s.  I know that because I discovered them in her closet when we visited her Chicago apartment one cold winter day in the ‘90s.  Mom lived in a 1920s-era apartment building, filled with radiators that ensured overheated air in her apartment.  [Please see my blog post “Coal:  A Personal History,” discussing the overheated air that coal-based radiators chugged out:  https://susanjustwrites.wordpress.com/2020/01/29/coal-a-personal-history/ ]

My daughters and I had worn clothing appropriate for a cold winter day in Chicago.  But as we sat in Mom’s overheated living room, we began to peel off our sweaters and other warm duds.  (My husband didn’t do any peeling.  He was too smart to have dressed as warmly as we had.)

It finally occurred to me that Mom might have saved her housedresses from long ago.  Maybe she even continued to wear them.  So I searched her closet and found three of them.  My daughters and I promptly changed, and we immediately felt much better.  But when we caught sight of ourselves, we laughed ourselves silly.  We looked a lot like the model in a Wendy’s TV commercial we called “Russian fashion show.”

In our favorite Wendy’s commercial, dating from 1990, Russian music plays in the background while a hefty woman dressed in a military uniform announces the fashion show in a heavy Russian accent.  The “model” comes down the runway wearing “day wear,” “evening wear,” and “beachwear.”  What’s hilariously funny is that she wears the same drab dress, along with a matching babushka, in each setting.  For “evening wear,” the only change is that she waves a flashlight around.  And for “beachwear,” she’s clutching a beach ball.

Wendy’s used clever commercials like this one to promote their slogan:  “Having no choice is no fun,” clearly implying that Wendy’s offered choices its fast-food competitors didn’t.  I don’t know whether these commercials helped Wendy’s bottom line, but they certainly afforded our family many, many laughs.

[If you need some laughs right now, you can find these commercials on YouTube.  Just enter words like “Wendy’s TV commercials” and “Russian fashion show.”]

Mom’s housedresses weren’t as drab as the dress worn by the model in our favorite commercial.  They tended to feature brightly colored prints.  Admittedly, they weren’t examples of trend-setting fashion.  But they certainly were cool and comfortable.

In our current crisis, we need to be creative and come up with new solutions to new problems.  For those women seeking something comfortable to wear, something different from what they’ve been wearing, colorful housedresses just might be the right choice.

A Snowy April 1st

On the morning of April 1st, The New York Times reported that the city had woken up to an April snowstorm, “with about 5 inches of snow expected to produce slushy streets and a tough morning commute.”  The storm followed a string of storms that had hit the East Coast in March with heavy snows and damaging winds.

This New York story about snow on April 1st reminded me of another April 1st snowstorm:  The one in Chicago that changed my life.

In the spring of 1970, I was already questioning whether I wanted to spend another year in Chicago.  My work at the Appellate and Test Case Division of the Chicago Legal Aid Bureau had its good points.  I was co-counsel with a lawyer at the Roger Baldwin Foundation of the ACLU (who happily became a lifelong friend) in a case challenging the restrictive Illinois abortion law, a law that made any abortion nearly impossible for all but the most affluent women in Illinois.  Our case was moving forward and had already secured a TRO allowing a teenage rape victim an emergency abortion.  A great legal victory!

But the rest of my life was at a standstill.  I was dating some of the men I’d met, but I hadn’t encountered anyone I wanted to pair up with.  In fact, I’d recently dumped a persistent suitor I found much too boring.  Relying on old friendships led to occasional lunches with both men and women I’d known in school, but the women were happily married and had limited time for a single woman friend.  I tried striking up friendships with other women as well as men, but so far that hadn’t expanded my social life very much.

I also haunted the Art Institute of Chicago, attending evening lectures and lunchtime events.  The art was exhilarating, but good times there were few.  When I turned up for an event one Sunday afternoon and left a few hours later, planning to take a bus home, I was surprised to see almost no one else on Michigan Avenue, leaving me feeling isolated and (in today’s parlance) somewhat creeped out.  (In 1970 Chicago hadn’t yet embarked on the kind of Sunday shopping that would bring people downtown on a Sunday afternoon.)  Similarly, I bought tickets for a piano series at Symphony Hall, and a series of opera tickets, but again I often felt alone among a group of strangers.

I still had lots of family in the area.  But being surrounded by family wasn’t exactly what I was looking for just then.

So although I was feeling somewhat wobbly about staying in Chicago, the question of where to settle instead loomed large.  When I’d left law school three years earlier and assumed a two-year clerkship with a federal judge in Chicago, I’d intended to head for Washington DC when my clerkship ended.  But in the interim Tricky Dick Nixon had lied his way into the White House, and I couldn’t abide the idea of moving there while he was in charge.

My thoughts then turned to California.  I’d briefly lived in Los Angeles during 8th grade (a story for another day) and very much wanted to stay, but my mother’s desire to return to Chicago after my father’s death won out.  Now I remembered how much I loved living in sunny California.  A February trip to Mexico had reinforced my thinking that I could happily live out my days in a warm-weather climate instead of slogging away in Chicago, winter after Chicago winter.

So I began making tentative efforts to seek out work in either LA or San Francisco, cities where I already had some good friends.

What happened on April 1st sealed the deal.  I’d made my way to work that morning despite the heavy snow that had fallen, and I took my usual ride home on a bus going down Michigan Avenue to where I lived just north of Oak Street.  The bus lumbered along, making its way through the snow-covered city, its major arteries by that time cleared by the city’s snow plows.  When the bus driver pulled up at the stop just across Lake Shore Drive from my apartment building, he opened the bus’s door, and I unsuspectingly descended the stairs to emerge outside.

Then, it happened.  I put a foot out the door, and it sank into a drift of snow as high as my knee.  I was wearing one of the miniskirts I favored back then, and my foot and leg were now stuck in the snow.  The bus abruptly closed its door, and I was left stranded in a snowbank, forced to pull myself out of it and attempt to cross busy Lake Shore Drive.

On April 1st.

Then and there I resolved to leave Chicago.  No ifs, ands, or buts about it.  I made up my mind to leave the snow-ridden city and head for warmer climes.

And I did.  After a May trip to the sunny West Coast, where I interviewed for jobs in both Los Angeles and San Francisco (with kind friends hosting me in both cities), I wound up accepting a job offer at a poverty-law support center at UCLA law school and renting a furnished apartment just across Gayley Avenue from the campus.

The rest is (my personal) history.  I immediately loved my new home and my new job.  Welcomed by friends, both old and new (including my brand-new colleagues at UCLA), I was happy to have left Chicago and its dreary winters behind.  And six weeks after arriving in LA, I met the wonderful guy I married a few months later.

What happened next?  I’ll save that for still another day.  But here’s the take-away:  a snowstorm on April 1st changed my life.  Maybe it can change yours, too.

 

Who the Heck Knows?

I have a new catch phrase:  “Who the heck knows?”

I started using it last fall, and ever since then I’ve found that it applies to almost everything that might arise in the future.

I don’t claim originality, but here’s how I came up with it:

At a class reunion in October, I was asked to be part of a panel of law school classmates who had veered off the usual lawyer-track and now worked in a totally different area.

Specifically, I was asked to address a simple question:  Why did I leave my work as a lawyer/law professor and decide to focus primarily on writing?

First, I explained that I’d always loved writing, continued to write even while I worked as a lawyer, and left my law-related jobs when they no longer seemed meaningful.  I added that my move to San Francisco led to launching my blog and publishing my first two novels.

I concluded:

“If I stay healthy and my brain keeps functioning, I want to continue to write, with an increasing focus on memoirs….  I’ll keep putting a lot of this kind of stuff on my blog.  And maybe it will turn into a book or books someday.

“Who the heck knows?”

 

After I said all that, I realized that my final sentence was the perfect way to respond to almost any question about the future.

Here’s why it seems to me to apply to almost everything:

None of us knows what the next day will bring.  Still, we think about it.

In “Men Explain Things to Me,” the author Rebecca Solnit notes “that we don’t know what will happen next, and the unlikely and the unimaginable transpire quite regularly.”  She finds uncertainty hopeful, while viewing despair as “a form of certainty,” certainty that “the future will be a lot like the present or will decline from it.”

Let’s cast certainty aside and agree, with Solnit, that uncertainty is hopeful.  Let’s go on to question what might happen in the uncertain future.

For example:

We wonder whether the midterm elections will change anything.

We wonder whether our kids will choose to follow our career choices or do something totally different.

We wonder whether our family history of a deadly disease will lead to having it ourselves.

We wonder whether to plan a trip to Peru.

We wonder whether we’re saving enough money for retirement.

We wonder how the U.S. Supreme Court will rule in an upcoming case.

We wonder what our hair will look like ten years from now.

We wonder what the weather will be like next week.

And we wonder what the current occupant of the White House will say or do regarding just about anything.

 

You may have an answer in mind, one that’s based on reason or knowledge or probability.   But if you’re uncertain…in almost every case, the best response is:  Who the heck knows?

If you’re stating this response to others, I suggest using “heck” instead of a word that might offend anyone.  It also lends a less serious tone to all of the unknowns out there, some of which are undoubtedly scary.

If you prefer to use a more serious tone, you can of course phrase things differently.

But I think I’ll stick with “Who the heck knows?”

Warning:  If you spend any time with me, you’ll probably hear me say it, again and again.

But then, who the heck knows?

Happy Holidays! Well, maybe…

 

As the greeting “Happy Holidays” hits your ears over and over during the holiday season, doesn’t it raise a question or two?

At a time when greed and acquisitiveness appear to be boundless, at least among certain segments of the American population, the most relevant questions seem to be:

  • Does money buy happiness?
  • If not, what does?

These questions have been the subject of countless studies.  Let’s review a few of the answers they’ve come up with.

To begin, exactly what is it that makes us “happy”?

A couple of articles published in the past two years in The Wall Street Journal—a publication certainly focused on the acquisition of money—summarized some results.

Wealth alone doesn’t guarantee a good life.  According to the Journal, what matters a lot more than a big income is how people spend it.  For instance, giving money away makes people much happier than spending it on themselves.  But when they do spend it on themselves, they’re a lot happier when they use it for experiences like travel rather than material goods.

The Journal looked at a study by Ryan Howell, an associate professor of psychology at San Francisco State University, which found that people may at first think material purchases offer better value for their money because they’re tangible and they last longer, while experiences are fleeting.  But Howell found that when people looked back at their purchases, they realized that experiences actually provided better value.  We even get more pleasure out of anticipating experiences than we do from anticipating the acquisition of material things.

Another psychology professor, Thomas Gilovich at Cornell, reached similar conclusions.  He found that people make a rational calculation:  “I can either go there, or I can have this.  Going there may be great, but it’ll be over fast.  But if I buy something, I’ll always have it.”  According to Gilovich, that’s factually true, but not psychologically true, because we “adapt to our material goods.”

We “adapt” to our material goods?  How?  Psychologists like Gilovich talk about “hedonic adaptation.”  Buying a new coat or a new car may provide a brief thrill, but we soon come to take it for granted.  Experiences, on the other hand, meet more of our “underlying psychological needs.”

Why?  Because they’re often shared with others, giving us a greater sense of connection, and they form a bigger part of our sense of identity.  You also don’t feel that you’re trying to keep up with the Joneses quite so much.  While it may bother you when you compare your material things to others’ things, comparing your vacation to someone else’s won’t bug you as much because “you still have your own experiences and your own memories.”

Another article in the Journal, published in 2015, focused on the findings of economists rather than psychologists.  A group of economists, including John Helliwell, a professor at the University of British Columbia, concluded that happiness (overall well-being) should not be measured by how much money we have, using metrics like per-capita income and gross domestic product (GDP).  “GDP is not even a very good measure of economic well-being,” he said.

Instead, the World Happiness Report, which Helliwell co-authored, ranked countries based on how people viewed the quality of their lives. It noted that six factors account for 75 percent of the differences between countries.  The six factors:  GDP, life expectancy, generosity, social support, freedom, and corruption.  Although GDP and life expectancy relate directly to income, the other four factors reflect a sense of security, trust, and autonomy.  So although the U.S. ranked first in overall GDP, it ranked only 15th in happiness because it was weaker in the other five variables.

According to Jeffrey D. Sachs, a professor at Columbia and co-author of the World Happiness Report, incomes in the U.S. have risen, but the country’s sense of “social cohesion” has declined.  The biggest factor contributing to this result is “distrust.”  Although the U.S. is very rich, we’re not getting the benefits of all this affluence.

If you ask people whether they can trust other people, Sachs said, “the American answer has been in significant decline.”  Fast-forward to 2017.  Today, when many of our political leaders shamelessly lie to us, our trust in others has no doubt eroded even further.

Even life expectancy is going downhill in the U.S.  According to the AP, U.S. life expectancy was on the upswing for decades, but 2016 marked the first time in more than a half-century that it fell in two consecutive years.

Let’s return to our original question:  whether money can buy happiness.  The most recent research I’ve come across is a study done at Harvard Business School, noted in the November-December 2017 issue of Harvard Magazine.  Led by assistant professor of business administration Ashley Whillans, it found that, in developed countries, people who trade money for time (by choosing to live closer to work, or to hire a housecleaner, for example) are happier. This was true across the socioeconomic spectrum.

According to Whillans, extensive research elsewhere has confirmed the positive emotional effects of taking vacations and going to the movies.  But the Harvard researchers wanted to explore a new idea:  whether buying ourselves out of negative experiences was another pathway to happiness.

Guess what:  it was.  One thing researchers focused on was “time stress” and how it affects happiness.  They knew that higher-earners feel that every hour of their time is financially valuable.  Like most things viewed as valuable, time is also perceived as scarce, and that scarcity translates into time stress, which can easily contribute to unhappiness.

The Harvard team surveyed U.S., Canadian, Danish, and Dutch residents, ranging from those who earned $30,000 a year to middle-class earners and millionaires. Canadian participants were given a sum of money—half to spend on a service that would save one to two hours, and half to spend on a material purchase like clothing or jewelry.  Participants who made a time-saving purchase (like buying take-out food) were more likely to report positive feelings, and less likely to report feelings of time stress, than they were after making a material purchase.

Whillans noted that in both Canada and the U.S., where busyness is “often flaunted as a status symbol,” opting to outsource jobs like cooking and cleaning can be culturally challenging.  Why?  Because people like to pretend they can do it all.  Women in particular find themselves stuck in this situation.  They have more educational opportunities and are likely to be making more money and holding more high-powered jobs, but their happiness is not increasing commensurately.

The Harvard team wants to explore this in the future.  According to Whillans, the initial evidence shows that among couples who buy time, “both men and women feel less pulled between the demands of work and home life,” and that has a positive effect on their relationship.  She hopes that her research will ameliorate some of the guilt both women and men may feel about paying a housekeeper or hiring someone to mow the lawn—or ordering Chinese take-out on Thursday nights.

Gee, Ashley, I’ve never felt guilty about doing any of that.  Maybe that’s one reason why I’m a pretty happy person.

How about you?

Whatever your answer may be, I’ll join the throng and wish you HAPPY HOLIDAYS!


The Last Straw(s)

A crusade against plastic drinking straws?  Huh?

At first glance, it may strike you as frivolous.  But it’s not.  In fact, it’s pretty darned serious.

In California, the city of Berkeley may kick off such a crusade.   In June, the city council directed its staff to research what would be California’s first city ordinance prohibiting the use of plastic drinking straws in bars, restaurants, and coffee shops.

Berkeley is responding to efforts by nonprofit groups like the Surfrider Foundation that want to eliminate a significant source of pollution in our oceans, lakes, and other bodies of water. According to the conservation group Save the Bay, the annual cleanup days held on California beaches have found that plastic straws and stirrers are the sixth most common kind of litter.  If they’re on our beaches, they’re flowing into the San Francisco Bay, into the Pacific Ocean, and ultimately into oceans all over the world.

As City Councilwoman Sophie Hahn, a co-author of the proposal to study the ban, has noted, “They are not biodegradable, and there are alternatives.”

I’ve been told that plastic straws aren’t recyclable, either.  So whenever I find myself using a plastic straw to slurp my drink, I conscientiously separate my waste:  my can of Coke Zero goes into the recycling bin; my plastic straw goes into the landfill bin.  This is nuts.  Banning plastic straws in favor of paper ones is the answer.

Realistically, it may be a tough fight to ban plastic straws because business interests (like the Monster Straw Co. in Laguna Beach) want to keep making and selling them.  And business owners claim that plastic straws are more cost-effective and that customers prefer them.  As Monster’s founder and owner, Natalie Buketov, told the SF Chronicle, “right now the public wants cheap plastic straws.”

Berkeley could vote on a ban by early 2018.

On the restaurant front, some chefs would like to see the end of plastic straws.  Spearheading a growing movement to steer eateries away from serving straws is Marcel Vigneron, owner-chef of Wolf Restaurant on Melrose Avenue in L.A.  Vigneron, who’s appeared on TV’s “Top Chef” and “Iron Chef,” is also an enthusiastic surfer, and he’s seen the impact of straw-pollution on the beaches and marine wildlife.  He likes the moniker “Straws Suck” to promote his effort to move away from straws, especially the play on words:  “You actually use straws to suck, and they suck because they pollute the oceans,” he told CBS in July.

Vigneron added that if a customer wants a straw, his restaurant has them.  But servers ask customers whether they want a straw instead of automatically putting them into customers’ drinks.  He notes that every day, 500 million straws are used in the U.S., and they could “fill up 127 school buses.”  He wants to change all that.

Drinking straws have a long history.  Their origins were apparently actual straw, or other straw-like grasses and plants.  The first paper straw, made from paper coated with paraffin wax, was patented in 1888 by Marvin Stone, who didn’t like the flavor of a rye grass straw added to his mint julep.  The “bendy” paper straw was patented in 1937.  But the plastic straw took off, along with many other plastic innovations, in the 1960s, and nowadays they’re difficult to avoid.

Campaigns like Surfrider’s have taken off because of mounting concern with plastic pollution.  Surfrider, which has also campaigned against other threats to our oceans, like plastic bags and cigarette butts, supports the “Straws Suck” effort, and according to author David Suzuki, Bacardi has joined with Surfrider in the movement to ban plastic straws.

Our neighbors to the north have already leaped ahead of California.  The town of Tofino in British Columbia claims that it mounted the very first “Straws Suck” campaign in 2016.  By Earth Day in April that year, almost every local business had banned plastic straws.  A fascinating story describing this effort appeared in the Vancouver Sun on April 22, 2016.

All of us in the U.S., indeed the world, need to pay attention to what plastic is doing to our environment.  “At the current rate, we are really headed toward a plastic planet,” according to the author of a study published in the journal Science Advances and reported by AP in July.  Roland Geyer, an industrial ecologist at UC Santa Barbara, noted that there’s enough discarded plastic to bury Manhattan under more than 2 miles of trash.

Geyer used the plastics industry’s own data to find that the amount of plastics made and thrown out is accelerating.  In 2015, the world created more than twice as much as it made in 1998.

The plastics industry has fought back, relying on the standard of cost-effectiveness.  It claims that alternatives to plastic, like glass, paper, or aluminum, would require more energy to produce.  But even if that’s true, the energy difference in the case of items like drinking straws would probably be minimal.  If we substitute paper straws for plastic ones, the cost difference would likely be negligible, while the difference for our environment (eliminating all those plastic straws floating around in our waterways) could be significant.

Aside from city bans and eco-conscious restaurateurs, we need to challenge entities like Starbucks.  The mega-coffee-company and coffeehouse-chain prominently offers, even flaunts, brightly-colored plastic straws for customers sipping its cold drinks.  What’s worse:  they happily sell them to others!  Just check out the Starbucks straws for sale on Amazon.com.  Knowing what we know about plastic pollution, I think Starbucks’s choice to further pollute our environment by selling its plastic straws on the Internet is unforgivable.

At the end of the day, isn’t this really the last straw?

 

Exploring the Universe with Two Young Muggles

Last week, I happily accompanied two young Muggles as we explored the universe together.

The universe?  Universal Studios in Hollywood, California, plus a few other nearby spots.

The young Muggles?  My astonishing granddaughters, both great fans of the Harry Potter (HP) books written by J.K. Rowling and the films based on them.  Eleven-year-old Beth has read all of the books at least twice, and nine-year-old Shannon has seen most of the movies.  Four of us grown-up Muggles came along, all conversant with HP except for me. (I’ve seen only the first film.)  According to Rowling, Muggles are people who lack any magical ability and aren’t born in a magical family.  I.e., people like us.

For me, our trip down the coast of California was an exhilarating escape from the concerns assaulting me at home:  dental issues, efforts to get my third novel published, and—of course—the current political scene.  We landed at the very edge of the continent, staying at a newly renovated hotel on Ocean Avenue in Santa Monica, where we literally faced the ocean and walked alongside it every day.

Bookending our fun-filled encounter with Universal Studios were visits to two great art museums.  Coming from San Francisco, a city inhabited by our own array of wonderful art museums and galleries, we didn’t expect to be exceedingly impressed by the museums offered in L.A.  But we were.

On Presidents’ Day, we headed to LACMA, the Los Angeles County Museum of Art, where a long, long entry line stretched as far as Wilshire Boulevard.  Because of atypically overcast skies on a school/work holiday?  Not entirely.  Admission was free that day (thanks, Target), so lots of folks showed up in search of fee-less exposure to outstanding works of art.

We viewed a lot of excellent art, but when our feet began to ache, we piled back into our rented minivan and went a little way down the road (Fairfax Avenue) to the Original Farmers’ Market.  Sampling food and drink in a farmers’ market dating back to 1934 was great fun.  We also took a quick look at The Grove, an upscale mall adjacent to the F.M., buying a book at Barnes and Noble before heading back to Santa Monica for the evening.

The next day was devoted to Universal Studios, where our first destination was The Wizarding World of Harry Potter.  Here I would at last explore the universe with two young Muggles.  We walked through other Universal attractions, but they didn’t tempt us…not just yet.  The lure of Harry Potter and friends took precedence.

We’d been advised that a must for first-timers was a ride called Harry Potter and the Forbidden Journey, so we decided to do that first.  As we approached the ride, we saw Muggles like us everywhere, including swarms of young people garbed in Hogwarts robes and other gear (all for sale at the shops, of course).  As we waited in line for the ride, we entered a castle (constructed to look like Hogwarts), where we were greeted by colorful talking portraits of HP characters hanging on the walls.

Warnings about the ride were ubiquitous.  It would be jarring, unsuitable for those prone to dizziness or motion sickness, and so on, ad nauseam.  As someone who’s worked as a lawyer, I knew precisely why these warnings were posted.  Universal Studios was trying to avoid any and all legal liability for complaints from ride-goers.

I decided to ignore the warnings and hopped on a fast-moving chair built for three people.  I was bumped around a bit against the chair’s hard surfaces, and I closed my eyes during some of the most startling 3-D effects, but I emerged from the ride in one piece and none the worse for wear.  Nine-year-old Shannon, however, was sobbing when we all left the ride together.  Even sitting next to her super-comforting dad hadn’t shielded her from the scariest special effects.

After the ride, we strolled around The Wizarding World, sampling sickeningly sweet Butterbeer, listening to the Frog Choir, and checking out the merchandise at shops like Gladrags Wizardwear and Ollivanders.  Ollivanders featured magic wands by “Makers of Fine Wands since 382 B.C.”  (Prices began at $40 for something that was essentially a wooden stick.)

Overall, we had a splendid time with HP and friends.  But now it was finally time to explore things non-HP.  Our first priority was the Studio Tour.  We piled into trams that set out on a tour of the sprawling backlot of the world’s largest working studio, where movies and TV shows are still filmed every day.  We got a chance to view the Bates Motel (including a live actor portraying creepy Norman Bates), a pretty realistic earthquake, a virtual flood, a plane-crash scene from The War of the Worlds, and two things I could have done without.  One featured King Kong in 3-D (the new Kong movie being heavily promoted at Universal); the other offered 3-D scenes from The Fast and the Furious films—not my cup of tea.  But overall it was a great tour for movie buffs like us.

After the tour, we headed for the fictional town of Springfield, home of the Simpsons family, stars of The Simpsons TV comedy program as well as their own film.  Soon we were surrounded by many of the hilarious Simpsons locations, including the Kwik-E-Mart, Moe’s Tavern, the Duff Brewery Beer Garden, and a sandwich shop featuring the Krusty Burger and the Sideshow Bob Footlong.  Characters like Krusty the Clown, Sideshow Bob, and the Simpsons themselves wandered all around Springfield, providing great fodder for photos.  For anyone who’s ever watched and laughed at The Simpsons, this part of Universal is tons of fun.

The Simpsons ride was terrific, too.  Once again, lots of warnings, lots of getting bumped around, and lots of 3-D effects, but it was worth it.  Maybe because I’ve always liked The Simpsons, even though I’ve hardly watched the TV show in years.

Other notable characters and rides at Universal include the Minions (from the Despicable Me films), Transformers, Jurassic Park, and Shrek.  Some of us sought out a couple of these, but I was happy to take a break, sit on a nearby bench, munch on popcorn, and sip a vanilla milkshake.

When the 6 p.m. closing time loomed, we had to take off.  Once more, we piled into the minivan and headed for an evening together in Santa Monica.  This time we all took in the Lego Batman movie.  I think I missed seeing some of it because, after a long day of exploring the universe, I fell asleep.

On the last day of our trip, we drove to the Getty Center, the lavish art museum located on a hill in Brentwood very close to the place where I got married decades ago.  Thanks to J. Paul Getty, who not only made a fortune in the oil industry but also liked to collect art, the Center features a large permanent collection as well as impressive changing exhibitions.

The six of us wandered through the museum’s five separate buildings, admiring the fabulous art as well as the stunning architecture.  We also lingered outside, relishing the gorgeous views and the brilliant sunshine that had been largely absent since our arrival in LA.  A bite to eat in the crowded café, a short trip to the museum store, and we six Muggles of various ages were off to Santa Monica one last time before driving home to San Francisco.

By the way, at the museum store you can buy a magnet featuring J. Paul Getty’s recipe for success:  “1. Rise early.  2. Work hard.  3. Strike oil.”  It certainly worked for him!

 

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area of the brain that responds to unpleasant emotional experiences—show that its response weakens with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.   When these leaders set out on a course of telling lies, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and points them out to the rest of us whenever it can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]

 

On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.

 

Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This example seems ripe for joke-telling, and sure enough, someone made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate a joke-teller as highly confident.  But inappropriate or insensitive jokes don’t do a joke-teller any favors because they can have a negative impact.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs. But in 1997, the Food and Drug Administration revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries–New Zealand is the other one–that permit this kind of advertising.

The Food and Drug Administration is responsible for regulating it and is supposed to take into account ethical and other concerns to prevent the undue influence of DTC advertising on consumer demand.  The fear was that advertising would lead to a demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at curbing skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  Calling for a three-year moratorium on advertising new prescription drugs directly to consumers, the bill would freeze these ads, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  That these ads are pretty much about making more and more money for the drug manufacturers.  And that most of the people who can afford them are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  And those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]

 


Looking Back…The Election of 1984 (Part II)

I wrote Part I of this blog post in late 1984.  In Part I, I commented on the campaign for president and vice president that had occurred that fall.

Part II, also written in 1984, offered my thoughts at the time about what might take place post-1984.

During the past 32 years, we’ve seen another major political party nominate a woman to be vice president.  In my view, the selection of Sarah Palin as that candidate in 2008 was John McCain’s replication of Walter Mondale’s unhappy selection of Geraldine Ferraro.  It was perhaps even more detrimental to McCain because he probably had a better chance of being elected president than Mondale had in 1984. Palin was even more untested as a political figure than Ferraro, having served only as a suburban mayor and a recently elected governor of a small state.  She soon demonstrated her lack of experience and knowledge of national issues, making her a genuine liability for McCain, who lost the support of many voters who might have otherwise been inclined to vote for him.

In 2016, American voters finally have the opportunity to select a woman as their president.  This time she’s a woman with a great deal of experience in public life and vast knowledge of the issues confronting our nation.  Although, as a candidate, Hillary Clinton hasn’t inspired unbridled enthusiasm, she’s as close to a “woman candidate of national stature” (to use my own words) as we’ve ever had.  In 1984, I predicted that a “woman candidate of national stature” whose position “represents the majority thinking in this country” would be “a realistic candidate,…and she will win.”

Was I right?

Here’s exactly what I wrote in 1984:

 

PART II

How does this leave things for the future?  Putting aside the personal future of Geraldine Ferraro, which is probably bright, what about other women candidates?  And what about the possibility of any woman being nominated and elected to the presidency or vice presidency of this country?  The Mondale-Ferraro defeat should not and must not be read as a defeat for women candidates in general.  Ferraro’s assets, both as a candidate and as a human being, are considerable, but, to be honest, she joined the campaign largely unknown and untested.
Another woman candidate might well fare otherwise.

Twenty-five years ago [i.e., in 1959], Margaret Chase Smith, a well-known and respected Republican U.S. Senator from Maine, announced her candidacy for the presidency.  She never had a realistic shot at it in that benighted era, but she might have had one in the 1980s.  She had established herself through a number of terms in the House of Representatives and the Senate, had climbed up the ladder in the Senate to committee chairmanships, and had become a recognized and admired figure on the national political scene.  A woman presenting similar credentials in the 1980s would bring a credibility to a national ticket that Ferraro, as a relative newcomer to the political arena, could not.  For this reason it’s important that women continue to run for political office on the state and local level, building political careers that will lead to the White House after they have achieved national stature—not before.

In all of the fuss made over Ferraro’s candidacy, something important was forgotten.  It’s not desirable for any political party to nominate a candidate solely or even primarily because that candidate is a woman or a black or a Hispanic—or a white Anglo male, for that matter.  The selection process must be based on the totality of what any given candidate will bring to the office.  The Democrats were wrong to select a woman candidate largely because she was a woman (those who said that a man with Ferraro’s credentials would never have been considered were—however painful it is to admit—correct).  They were wrong because Americans do not, and should not, vote for “symbols.”  When it became clear that Jesse Jackson wasn’t a candidate with a broad-based constituency but had become a “black” candidate and nothing more, that was the death knell for any realistic chance he had of winning the nomination.  But saying that is not saying that no black candidate can ever win.

Women candidates and candidates who are members of minority groups have run for office and won broad-based electoral support where they have been viewed as representing the best interests of a majority of the electorate.  But women and others who are viewed as “symbols,” representing only that segment of the electorate from which they came, will never win that sort of broad-based support.  On the contrary, their candidacies may serve only to polarize voters, leading to strife and bitterness among the electorate, and probable if not certain defeat at the ballot box.

When Mondale chose Ferraro, he already had the votes of the politically aware women for whom Ferraro became a symbol, by virtue of Mondale’s position on such issues as the ERA [the Equal Rights Amendment] and [the issue of] comparable worth.  He would not have lost the votes of those women no matter what else he did.  Likewise, Reagan didn’t have the votes of those women and wouldn’t have had them no matter what he did.  Even in the unimaginable event that Reagan had selected a woman running-mate, she would have had to be a woman whose thinking was compatible with his, and if she had endorsed Reagan’s views on the ERA (à la Phyllis Schlafly), feminists wouldn’t have been any more likely to vote for Reagan-Schlafly than Reagan-Bush.  It shouldn’t therefore be terribly difficult to understand why women who were otherwise happy with Reagan weren’t inclined to switch to Mondale simply because of Ferraro.

In sum, women voters are really not very different from men voters, and Democratic strategists who thought otherwise were proved wrong in 1984.  Women vote their interests, and these do not necessarily coincide with what is popularly perceived as “women’s” interests.  Women, like men, are concerned about the economy, our country’s status in the world, and a host of other matters along with the particular concerns they may have as women.

When a woman candidate of national stature emerges whose position on these interests represents the majority thinking in the country, she will be a realistic candidate for the vice presidency or the presidency, and she will win.

Hamilton, Hamilton…Who Was He Anyway?

Broadway megahit “Hamilton” has brought the Founding Parent (okay, Founding Father) into a spotlight unknown since his own era.

Let’s face it.  The Ron Chernow biography, turned into a smash Broadway musical by Lin-Manuel Miranda, has made Alexander Hamilton into the icon he hasn’t been–or maybe never was–in a century or two. Just this week, the hip-hop musical “Hamilton” received a record-breaking 16 Tony Award nominations.

His new-found celebrity has even influenced his modern-day successor, current Treasury Secretary Jack Lew, leading Lew to reverse his earlier plan to remove Hamilton from the $10 bill and replace him with the image of an American woman.

Instead, Hamilton will remain on the front of that bill, with a group representing suffragette leaders in 1913 appearing on the back, while Harriet Tubman will replace no-longer-revered and now-reviled President Andrew Jackson on the front of the $20 bill.  We’ll see other changes to our paper currency during the next five years.

But an intriguing question remains:  How many Americans—putting aside those caught up in the frenzy on Broadway, where theatergoers are forking over $300 and $400 to see “Hamilton” on stage—know who Hamilton really was?

A recent study done by memory researchers at Washington University in St. Louis has confirmed that most Americans are confident that Hamilton was once president of the United States.

According to Henry L. Roediger III, a human memory expert at Wash U, “Our findings from a recent survey suggest that about 71 percent of Americans are fairly certain that [Hamilton] is among our nation’s past presidents.  I had predicted that Benjamin Franklin would be the person most falsely recognized as a president, but Hamilton beat him by a mile.”

Roediger (whose official academic title is the James S. McDonnell Distinguished University Professor in Arts & Sciences) has been testing undergrad college students since 1973, when he first administered a test while he was himself a psychology grad student at Yale.  His 2014 study, published in the journal Science, suggested that we as a nation do fairly well at naming the first few and the last few presidents.  But fewer than 20 percent can remember more than the last eight or nine presidents in order.

Roediger’s more recent study is a bit different because its goal was to gauge how well Americans simply recognize the names of past presidents.  Name-recognition should be much less difficult than recalling names from memory and listing them on a blank sheet of paper, which was the challenge in 2014.

The 2016 study, published in February in the journal Psychological Science, asked participants to identify past presidents, using a list of names that included actual presidents as well as famous non-presidents like Hamilton and Franklin.  Other familiar names from U.S. history, and non-famous but common names, were also included.

Participants were asked to indicate their level of certainty on a scale from zero to 100, where 100 was absolutely certain.

What happened?  The rate for correctly recognizing the names of past presidents was 88 percent overall, although laggards Franklin Pierce and Chester Arthur rated less than 60 percent.

Hamilton was more frequently identified as president (with 71 percent thinking that he was) than several actual presidents, and people were very confident (83 on the 100-point scale) that he had been president.

More than a quarter of the participants incorrectly recognized others, notably Franklin, Hubert Humphrey, and John Calhoun, as past presidents.  Roediger thinks that probably happened because people are aware that these were important figures in American history without really knowing what their actual roles were.

Roediger and his co-author, K. Andrew DeSoto, suggest that our ability to recognize the names of famous people hinges on their names appearing in a context related to the source of their fame.  “Elvis Presley was famous, but he would never be recognized as a past president,” Roediger says.   It’s not enough to have a familiar name.  It must be “a familiar name in the right context.”

This study is part of an emerging line of research focusing on how people remember history.  The recent studies reveal that the ability to remember the names of presidents follows consistent and reliable patterns.  “No matter how we test it—in the same experiment, with different people, across generations, in the laboratory, with online studies, with different types of tests—there are clear patterns in how the presidents are remembered and how they are forgotten,” DeSoto says.

While decades-old theories about memory can explain the results to some extent, these findings are sparking new ideas about fame and just how human memory-function treats those who achieve it.

As Roediger notes, “knowledge of American presidents is imperfect….”  False fame can arise from “contextual familiarity.”  And “even the most famous person in America may be forgotten in as short a time as 50 years.”

So…how will Alexander Hamilton’s new-found celebrity hold up?  Judging from the astounding success of the hip-hop musical focusing on him and his cohorts, one can predict with some confidence that his memory will endure far longer than it otherwise might have.

This time, he may even be remembered as our first Secretary of the Treasury, not as the president he never was.