Category Archives: psychological research

Eating Dessert Can Help You Eat Better? Seriously?

I just celebrated my birthday with a scrumptious meal at a charming San Francisco restaurant.  Sharing a fabulous candle-topped dessert with my companion was a slam-dunk way to end a perfect evening.

Should I regret consuming that delicious dessert?

The answer, happily, is no.  I should have no regrets about eating my birthday surprise, and a recent study backs me up.

According to this study, published in the Journal of Experimental Psychology: Applied and reported in a recent issue of TIME magazine, having an occasional dessert may actually be a useful tool to help you eat better.

Here’s what happened:  More than 130 university students and staff members were asked to choose between two desserts at the start of the lunch line in a campus cafeteria.  The study found that those who made the “decadent” selection—lemon cheesecake—chose healthier meals and consumed fewer calories overall than those who picked fresh fruit.  Simply selecting the cheesecake first was enough to influence the rest of their order.

Almost 70 percent of those who picked the cheesecake went on to choose a healthier main dish and side dish, while only about a third of those selecting fruit made the healthier choice.  The cheesecake-choosers also ate about 250 fewer total calories during their meal compared with the fruit-choosers.

Study co-author Martin Reimann, an assistant professor of marketing and cognitive science at the University of Arizona, concluded that choosing something healthy first can give us a “license” to choose something less healthy later.  But if you turn that notion around and choose something more “decadent” early on, “then this license [to choose high-calorie food] has already expired.”  In other words, making a calorie-laden choice at the beginning of the meal seems to steer people toward healthier choices later.

No one is suggesting that we all indulge in dessert on an everyday basis.  For many of us, the pursuit of good health leads us to avoid sugary desserts and choose fresh fruit instead.  But Reimann believes that choosing dessert strategically can pay off.  He advises us to be “mindful and conscious about the different choices you make.”

Will I order lemon cheesecake, a chocolate brownie, or a spectacular ice-cream concoction for dessert at my next meal?  Probably not.  But I am going to keep the Arizona research in mind.

You should, too.  Beginning your meal with the knowledge that it could end with a calorie-laden dessert just might prompt you to select a super-healthy salad for your entrée, adding crunchy green veggies on the side.


Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These naysayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke nothing more than 3 out of 5 on a happiness rating, the result was very different.  After receiving the thank-you notes, the recipients reported being far happier than the senders had predicted:  many said they were “ecstatic,” scoring 4 out of 5 on the happiness rating.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other benefits:  gratitude enhances empathy and reduces aggression (a 2012 study by the University of Kentucky), improves sleep (a 2011 study in Applied Psychology: Health and Well-Being), and boosts self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives filled with what might be called “blessings,” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world can join us in our much more gratitude-worthy place on this planet.


Remembering Stuff

Are you able to remember stuff pretty well?  If you learned that stuff quickly, you have a very good chance of retaining it, even if you spent less time studying it than you might have.

These conclusions arise from a new study by psychologists at Washington University in St. Louis.  According to its lead author, Christopher L. Zerr, “Quicker learning appears to be more durable learning.”

The study, published in the journal Psychological Science, tried a different way to gauge differences in how quickly and well people learn and retain information.  Using word-pairs that paired English with a difficult-to-learn language, Lithuanian, the researchers created a “learning-efficiency score” for each participant.

“In each case, initial learning speed proved to be a strong predictor of long-term retention,” said senior author Kathleen B. McDermott, professor of psychological and brain sciences at Washington University.

Forty-six of the participants returned for a follow-up study three years later.  The results confirmed the original findings.

What explains this outcome?  The researchers suggest two possibilities.

First, people with better attention control may be more effective while learning material, avoiding both distraction and forgetting.  Another explanation:  efficient learners use more effective learning strategies, like using a keyword to relate the two words in a pair.

The researchers don’t think their job is done.  Instead, they’d like to see future research on learning efficiency that would have an impact in educational and clinical settings.

The goal is to be able to teach students how to be efficient learners, and to forestall the effects of disease, aging, and neuropsychological disorders on learning and retention.

Conclusion:  If you’ve always been a quick learner, that’s probably stood you in good stead, enabling you to remember stuff you learned quickly in the first place.


[This blog post is not the one I originally intended to write this month, when I planned to focus on how important it is to vote in the midterm elections in November.  Publishing my new novel, RED DIANA, this month has kept me from writing that post, but I hope to publish it at some point.  It would be something of a reprise of a post I published in September 2014, “What Women Need to Do.”]

Happy Holidays! Well, maybe…


As the greeting “Happy Holidays” hits your ears over and over during the holiday season, doesn’t it raise a question or two?

At a time when greed and acquisitiveness appear to be boundless, at least among certain segments of the American population, the most relevant questions seem to be:

  • Does money buy happiness?
  • If not, what does?

These questions have been the subject of countless studies.  Let’s review a few of the answers they’ve come up with.

To begin, exactly what is it that makes us “happy”?

A couple of articles published in the past two years in The Wall Street Journal—a publication certainly focused on the acquisition of money—summarized some results.

Wealth alone doesn’t guarantee a good life.  According to the Journal, what matters a lot more than a big income is how people spend it.  For instance, giving money away makes people much happier than spending it on themselves.  But when they do spend it on themselves, they’re a lot happier when they use it for experiences like travel rather than material goods.

The Journal looked at a study by Ryan Howell, an associate professor of psychology at San Francisco State University, which found that people may at first think material purchases offer better value for their money because they’re tangible and longer-lasting, while experiences are fleeting.  But when people looked back at their purchases, Howell found, they realized that experiences actually provided better value.  We even get more pleasure out of anticipating experiences than we do from anticipating the acquisition of material things.

Another psychology professor, Thomas Gilovich at Cornell, reached similar conclusions.  He found that people make a rational calculation:  “I can either go there, or I can have this.  Going there may be great, but it’ll be over fast.  But if I buy something, I’ll always have it.”  According to Gilovich, that’s factually true, but not psychologically true, because we “adapt to our material goods.”

We “adapt” to our material goods?  How?  Psychologists like Gilovich talk about “hedonic adaptation.”  Buying a new coat or a new car may provide a brief thrill, but we soon come to take it for granted.  Experiences, on the other hand, meet more of our “underlying psychological needs.”

Why?  Because they’re often shared with others, giving us a greater sense of connection, and they form a bigger part of our sense of identity.  You also don’t feel that you’re trying to keep up with the Joneses quite so much.  While it may bother you when you compare your material things to others’ things, comparing your vacation to someone else’s won’t bug you as much because “you still have your own experiences and your own memories.”

Another article in the Journal, published in 2015, focused on the findings of economists rather than psychologists.  A group of economists, including John Helliwell, a professor at the University of British Columbia, concluded that happiness, or overall well-being, should not be measured by how much money we have, using metrics like per-capita income and gross domestic product (GDP).  “GDP is not even a very good measure of economic well-being,” he said.

Instead, the World Happiness Report, which Helliwell co-authored, ranked countries based on how people viewed the quality of their lives. It noted that six factors account for 75 percent of the differences between countries.  The six factors:  GDP, life expectancy, generosity, social support, freedom, and corruption.  Although GDP and life expectancy relate directly to income, the other four factors reflect a sense of security, trust, and autonomy.  So although the U.S. ranked first in overall GDP, it ranked only 15th in happiness because it was weaker in the other five variables.

According to Jeffrey D. Sachs, a professor at Columbia and co-author of the World Happiness Report, incomes in the U.S. have risen, but the country’s sense of “social cohesion” has declined.  The biggest factor contributing to this result is “distrust.”  Although the U.S. is very rich, we’re not getting the benefits of all this affluence.

If you ask people whether they can trust other people, Sachs said, “the American answer has been in significant decline.”  Fast-forward to 2017:  today, when many of our political leaders shamelessly lie to us, our trust in others has no doubt eroded even further.

Even life expectancy is going downhill in the U.S.  According to the AP, U.S. life expectancy was on the upswing for decades, but 2016 marked the first time in more than a half-century that it fell in two consecutive years.

Let’s return to our original question:  whether money can buy happiness.  The most recent research I’ve come across is a study done at Harvard Business School, noted in the November-December 2017 issue of Harvard Magazine.  Led by assistant professor of business administration Ashley Whillans, it found that, in developed countries, people who trade money for time (by choosing to live closer to work, or by hiring a housecleaner, for example) are happier.  This was true across the socioeconomic spectrum.

According to Whillans, extensive research elsewhere has confirmed the positive emotional effects of taking vacations and going to the movies.  But the Harvard researchers wanted to explore a new idea:  whether buying ourselves out of negative experiences was another pathway to happiness.

Guess what:  it was.  One thing researchers focused on was “time stress” and how it affects happiness.  They knew that higher-earners feel that every hour of their time is financially valuable.  Like most things viewed as valuable, time is also perceived as scarce, and that scarcity translates into time stress, which can easily contribute to unhappiness.

The Harvard team surveyed U.S., Canadian, Danish, and Dutch residents, ranging from those who earned $30,000 a year to middle-class earners and millionaires.  Canadian participants were given a sum of money:  half to spend on a service that would save one to two hours, and half to spend on a material purchase like clothing or jewelry.  Participants reported more positive feelings, and fewer feelings of time stress, after making a time-saving purchase (like buying take-out food) than after making a material purchase.

Whillans noted that in both Canada and the U.S., where busyness is “often flaunted as a status symbol,” opting to outsource jobs like cooking and cleaning can be culturally challenging.  Why?  Because people like to pretend they can do it all.  Women in particular find themselves stuck in this situation.  They have more educational opportunities and are likely to be making more money and holding more high-powered jobs, but their happiness is not increasing commensurately.

The Harvard team wants to explore this in the future.  According to Whillans, the initial evidence shows that among couples who buy time, “both men and women feel less pulled between the demands of work and home life,” and that has a positive effect on their relationship.  She hopes that her research will ameliorate some of the guilt both women and men may feel about paying a housekeeper or hiring someone to mow the lawn—or ordering Chinese take-out on Thursday nights.

Gee, Ashley, I’ve never felt guilty about doing any of that.  Maybe that’s one reason why I’m a pretty happy person.

How about you?

Whatever your answer may be, I’ll join the throng and wish you HAPPY HOLIDAYS!


The Summer of Love and Other Random Thoughts

  1.  The CEO pay ratio is now 271-to-1.

According to the Economic Policy Institute’s annual report on executive compensation, released on July 20, chief executives of America’s 350 largest companies made an average of $15.6 million in 2016, or 271 times what the typical worker earned.

The ratio was slightly lower than it was in 2015, when average CEO pay was $16.3 million and the ratio was 286-to-1.  And it was far below the highest ratio on record, 376-to-1 in 2000.

But before we pop any champagne corks because of the slightly lower number, let’s recall that in 1989, after eight years of Ronald Reagan in the White House, the ratio was 59-to-1, and in 1965, in the midst of the Vietnam War and civil rights turmoil, it was 20-to-1.
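To make those ratios concrete, here’s a quick back-of-the-envelope calculation, just a sketch in Python based on the EPI figures quoted above (the implied dollar amounts are my own arithmetic, not EPI’s):

```python
# Back-of-the-envelope arithmetic using the EPI figures quoted above.
avg_ceo_pay_2016 = 15_600_000   # average CEO pay, 350 largest U.S. companies
ratio_2016 = 271                # CEO-to-typical-worker pay ratio in 2016

# What a 271-to-1 ratio implies about typical worker pay:
typical_worker_pay = avg_ceo_pay_2016 / ratio_2016
print(f"Implied typical worker pay: ${typical_worker_pay:,.0f}")   # ~$57,565

# What average CEO pay would look like if the older ratios still held,
# assuming that same typical worker pay:
for year, ratio in [(1989, 59), (1965, 20)]:
    print(f"{year} ratio ({ratio}-to-1): ${typical_worker_pay * ratio:,.0f}")
# 1989: ~$3.4 million; 1965: ~$1.2 million -- versus $15.6 million today
```

In other words, if CEOs were still paid at the 1965 multiple, the average chief executive of a top-350 company would earn a bit over $1 million a year, not $15.6 million.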

Let’s reflect on those numbers for a moment.  Just think about how distorted these ratios are and what they say about our country.

Did somebody say “income inequality”?

[This report appeared in the San Francisco Chronicle on July 21, 2017.]


  2. Smiling

 I’ve written in this blog, at least once before, about the positive results of smiling.  [Please see “If You’re Getting Older, You May Be Getting Nicer,” published on May 30, 2014.]

But I can’t resist adding one more item about smiling.  In a story in The Wall Street Journal in June, a cardiologist named Dr. John Day wrote about a woman, aged 107, whom he met in the village of Bapan, China.  Bapan is known as “Longevity Village” because so many of its people are centenarians (one for every 100 who live there; the average in the U.S. is one in 5,780).

Day asked the 107-year-old woman how she reached her advanced age.  Noting that she was always smiling, he asked if she smiled even through the hard times in her life.  She replied, “Those are the times in which smiling is most important, don’t you agree?”

Day cited a study published in Psychological Science in 2010.  It showed that baseball players who smiled in their playing-card photographs lived seven years longer, on average, than those who looked stern.

So, he wrote, “The next time you’re standing in front of a mirror, grin at yourself.  Then make that a habit.”

[Dr. Day’s article appeared in The Wall Street Journal on June 24-25, 2017.]


  3. The Summer of Love

This summer, San Francisco is awash in celebrations of the “Summer of Love,” the name attached to the city’s summer of 1967.   Fifty years later, the SF Symphony, the SF Jazz Center, a bunch of local theaters, even the Conservatory of Flowers in Golden Gate Park, have all presented their own take on it.

Most notably, “The Summer of Love Experience,” an exhibit at the de Young Museum in Golden Gate Park, is a vivid display of the music, artwork, and fashions that popped up in San Francisco that summer.

As a happy denizen of San Francisco for the past 12 years, I showed up at the de Young to see the exhibit for myself.

My favorite part of the exhibit was the sometimes outrageous fashions artfully displayed on an array of mannequins.  Not surprisingly, they included a healthy representation of denim.  Some items were even donated by the Levi’s archives in San Francisco.  [Please see the reference to Levi’s in my post, “They’re My Blue Jeans and I’ll Wear Them If I Want To,” published in May.]

Other fashions featured colorful beads, crochet, appliqué, and embroidery, often on silk, velvet, leather, and suede.  Maybe it was my favorite part of the exhibit because I’ve donated clothing from the same era to the Chicago History Museum, although my own clothing choices back then were considerably different.

Other highlights were perfectly preserved psychedelic posters featuring rock groups like The Grateful Dead, The Doors, and Moby Grape, along with record album covers and many photographs taken in San Francisco during the summer of 1967.  Joan Baez made an appearance as well, notably with her two sisters in a prominently displayed anti-Vietnam War poster.  Rock and roll of the era plays constantly in the background throughout the exhibit.

In 1967, I may have been vaguely aware of San Francisco’s Summer of Love, but I was totally removed from it.  I’d just graduated from law school, and back in Chicago, I was immersed in studying for the Illinois bar exam.  I’d also begun to show up in the chambers of Judge Julius J. Hoffman, the federal district judge for whom I’d be a law clerk for the next two years.  [Judge Hoffman will be the subject of a future post or two.]

So although the whole country was hearing news stories about the antics of the thousands of hippies who flocked to Haight-Ashbury and Golden Gate Park in San Francisco, my focus was on my life in Chicago, with minimal interest in what was happening 2000 miles away.  For that reason, much of the exhibit at the de Young was brand-new to me.

The curators of the exhibit clearly chose to emphasize the creativity of the art, fashion, and music of the time.  At the same time, the exhibit largely ignores the downside of the Summer of Love—the widespread use of drugs, the unpleasant changes that took place in the quiet neighborhood around Haight-Ashbury, the problems created by the hordes of young people who filled Golden Gate Park.

But I was glad I saw it–twice.

You may decide to come to San Francisco to see this exhibit for yourself.

If you do, please don’t forget:  “If you’re going to San Francisco, be sure to wear some flowers in your hair.”


Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions that deal with the public, like restaurant servers, choose to act rudely, when a more cheerful demeanor would probably make everyone feel better.

Pressure undoubtedly plays a huge role:  pressure to perform at work, and pressure to get everywhere as fast as possible.  That pressure can create a high degree of stress, the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly clipped by a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting another worrisome result:  serious errors made by doctors and nurses who have been treated rudely.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies, reported on April 12 in The Wall Street Journal, concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses who were providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams heard a statement by an American MD who was observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department.  The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for the Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the authors of the study:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, a well-known pediatrician and writer who covers health care for The New York Times, has recently reviewed studies of rudeness in medical settings.  She looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective—in teamwork, communication, and diagnostic and technical skills—after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  A 2015 British study found that “rude, dismissive and aggressive communication” between doctors affected 31 percent of those surveyed.  And rudeness toward medical students by attending physicians, residents, and nurses also appears to be a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Random Thoughts

On truthfulness

Does it bother you when someone lies to you?  It bothers me.  And I just learned astonishing new information about people who repeatedly tell lies.

According to British neuroscientists, brain scans of the amygdala—the area of the brain that responds to unpleasant emotional experiences—show that its response becomes desensitized with each successive lie.

In other words, the more someone lies, the less that person’s brain reacts to it.  And the easier it is for him or her to lie the next time.

These researchers concluded that “little white lies,” usually considered harmless, really aren’t harmless at all because they can lead to big fat falsehoods.  “What begins as small acts of dishonesty can escalate into larger transgressions.”

This study seems terribly relevant right now.  Our political leaders (one in particular, along with some of his cohorts) have often been caught telling lies.   When these leaders set out on a course of telling lies, watch out.  They’re likely to keep doing it.  And it doesn’t bother them a bit.

Let’s hope our free press remains truly free, ferrets out the lies that impact our lives, and exposes them to the rest of us whenever it can.

[This study was published in the journal Nature Neuroscience and noted in the January-February 2017 issue of the AARP Bulletin.]


On language

When did “waiting for” become “waiting on”?

Am I the only English-speaking person who still says “waiting for”?

I’ve been speaking English my entire life, and the phrase “waiting on” has always meant what waiters or waitresses did.  Likewise, salesclerks in a store.  They “waited on” you.

“Waiting for” was an entirely different act.   In a restaurant, you—the patron—decide to order something from the menu.  Then you begin “waiting for” it to arrive.

Similarly:  Even though you’re ready to go somewhere, don’t you sometimes have to “wait for” someone before you can leave?

Here are three titles you may have come across.  First, did you ever hear of the 1935 Clifford Odets play “Waiting for Lefty”?  (Although it isn’t performed a lot these days, it recently appeared on stage in the Bay Area.)  In Odets’s play, a group of cabdrivers “wait for” someone named Lefty to arrive.  While they wait for him, they debate whether they should go on strike.

Even better known, Samuel Beckett’s play, “Waiting for Godot,” is still alive and well and being performed almost everywhere.  [You can read a little bit about this play—and the two pronunciations of “Godot”—in my blog post, “Crawling through Literature in the Pubs of Dublin, Ireland,” published in April 2016.]  The lead characters in the play are forever waiting for “Godot,” usually acknowledged as a substitute for “God,” who never shows up.

A more recent example is the 1997 film, “Waiting for Guffman.”  The cast of a small-town theater group anxiously waits for a Broadway producer named Guffman to appear, hoping that he’ll like their show.  Christopher Guest and Eugene Levy, who co-wrote and starred in the film, were pretty clearly referring to “Waiting for Godot” when they wrote it.

Can anyone imagine replacing “Waiting for” in these titles with “Waiting on”?

C’mon!

Yet everywhere I go, I constantly hear people say that they’re “waiting on” a friend to show up or “waiting on” something to happen.

This usage has even pervaded Harvard Magazine.  In a recent issue, an article penned by an undergraduate included this language:  “[T]hey aren’t waiting on the dean…to make the changes they want to see.”

Hey, undergrad, I’m not breathlessly waiting for your next piece of writing!  Why?  Because you should have said “waiting for”!

Like many of the changes in English usage I’ve witnessed in recent years, this one sounds very wrong to me.


Have you heard this one?

Thanks to scholars at the U. of Pennsylvania’s Wharton School and Harvard Business School, I’ve just learned that workers who tell jokes—even bad ones—can boost their chances of being viewed by their co-workers as more confident and more competent.

Joking is a form of humor, and humor is often seen as a sign of intelligence and a good way to get ideas across to others.  But delivering a joke well also demands sensitivity and some regard for the listeners’ emotions.

The researchers, who ran experiments involving 2,300 participants, were trying to gauge responses to joke-tellers. They specifically wanted to assess the impact of joking on an individual’s status at work.

In one example, participants had to rate individuals who explained a service that removed pet waste from customers’ yards.  This scenario seems ripe for joke-telling, and sure enough, one of the presenters made a joke about it.

Result?  The person who told the joke was rated as more competent and higher in status than those who didn’t.

In another example, job-seekers were asked to suggest a creative use for an old tire.  One of them joked, “Someone doing CrossFit could use it for 30 minutes, then tell you about it forever.”  This participant was rated higher in status than two others, who either made an inappropriate joke about a condom or made a serious suggestion (“Make a tire swing out of it.”).

So jokes work—but only if they’re appropriate.

Even jokes that fell flat led participants to rate the joke-teller as highly confident.  But inappropriate or insensitive jokes do a joke-teller no favors:  they can lower the teller’s perceived status.

Common sense tells me that the results of this study also apply in a social setting.  Telling jokes to your friends is almost always a good way to enhance your relationship—as long as you avoid offensive and insensitive jokes.

The take-away:  If you can tell an appropriate joke to your colleagues and friends, they’re likely to see you as confident and competent.

So next time you need to explain something to others, in your workplace or in any other setting, try getting out one of those dusty old joke books and searching for just the right joke.

[This study, reported in The Wall Street Journal on January 18, 2017, and revisited in the same publication a week later, appeared in the Journal of Personality and Social Psychology.]

Hamilton, Hamilton…Who Was He Anyway?

Broadway megahit “Hamilton” has brought the Founding Parent (okay, Founding Father) into a spotlight unknown since his own era.

Let’s face it.  The Ron Chernow biography, turned into a smash Broadway musical by Lin-Manuel Miranda, has made Alexander Hamilton into the icon he hasn’t been (or maybe never was) in a century or two.  Just this week, the hip-hop musical “Hamilton” received a record-breaking 16 Tony Award nominations.

His new-found celebrity has even influenced his modern-day successor, current Treasury Secretary Jack Lew, leading Lew to reverse his earlier plan to remove Hamilton from the $10 bill and replace him with the image of an American woman.

Instead, Hamilton will remain on the front of that bill, with a group representing suffragette leaders in 1913 appearing on the back, while Harriet Tubman will replace no-longer-revered and now-reviled President Andrew Jackson on the front of the $20 bill.  We’ll see other changes to our paper currency during the next five years.

But an intriguing question remains:  How many Americans—putting aside those caught up in the frenzy on Broadway, where theatergoers are forking over $300 and $400 to see “Hamilton” on stage—know who Hamilton really was?

A recent study done by memory researchers at Washington University in St. Louis found that most Americans believe Hamilton was once president of the United States.

According to Henry L. Roediger III, a human memory expert at Wash U, “Our findings from a recent survey suggest that about 71 percent of Americans are fairly certain that [Hamilton] is among our nation’s past presidents.  I had predicted that Benjamin Franklin would be the person most falsely recognized as a president, but Hamilton beat him by a mile.”

Roediger (whose official academic title is the James S. McDonnell Distinguished University Professor in Arts & Sciences) has been testing undergraduates since 1973, when he first administered a test while he was himself a psychology grad student at Yale.  His 2014 study, published in the journal Science, suggested that we as a nation do fairly well at naming the first few and the last few presidents.  But fewer than 20 percent of us can remember more than the last 8 or 9 presidents in order.

Roediger’s more recent study is a bit different because its goal was to gauge how well Americans simply recognize the names of past presidents.  Name-recognition should be much less difficult than recalling names from memory and listing them on a blank sheet of paper, which was the challenge in 2014.

The 2016 study, published in February in the journal Psychological Science, asked participants to identify past presidents, using a list of names that included actual presidents as well as famous non-presidents like Hamilton and Franklin.  Other familiar names from U.S. history, and non-famous but common names, were also included.

Participants were asked to indicate their level of certainty on a scale from zero to 100, where 100 was absolutely certain.

What happened?  The rate for correctly recognizing the names of past presidents was 88 percent overall, although laggards Franklin Pierce and Chester Arthur rated less than 60 percent.

Hamilton was more frequently identified as president (with 71 percent thinking that he was) than several actual presidents, and people were very confident (83 on the 100-point scale) that he had been president.

More than a quarter of the participants incorrectly recognized others, notably Franklin, Hubert Humphrey, and John Calhoun, as past presidents.  Roediger thinks that probably happened because people are aware that these were important figures in American history without really knowing what their actual roles were.

Roediger and his co-author, K. Andrew DeSoto, suggest that our ability to recognize the names of famous people hinges on their names appearing in a context related to the source of their fame.  “Elvis Presley was famous, but he would never be recognized as a past president,” Roediger says.   It’s not enough to have a familiar name.  It must be “a familiar name in the right context.”

This study is part of an emerging line of research focusing on how people remember history.  The recent studies reveal that the ability to remember the names of presidents follows consistent and reliable patterns.  “No matter how we test it—in the same experiment, with different people, across generations, in the laboratory, with online studies, with different types of tests—there are clear patterns in how the presidents are remembered and how they are forgotten,” DeSoto says.

While decades-old theories about memory can explain the results to some extent, these findings are sparking new ideas about fame and how human memory treats those who achieve it.

As Roediger notes, “knowledge of American presidents is imperfect….”  False fame can arise from “contextual familiarity.”  And “even the most famous person in America may be forgotten in as short a time as 50 years.”

So…how will Alexander Hamilton’s new-found celebrity hold up?  Judging from the astounding success of the hip-hop musical focusing on him and his cohorts, one can predict with some confidence that his memory will endure far longer than it otherwise might have.

This time, he may even be remembered as our first Secretary of the Treasury, not as the president he never was.