
Remembering Stuff

Are you able to remember stuff pretty well?  If you learned that stuff quickly, you have a very good chance of retaining it, even if you spent less time studying it than you might have.

These conclusions arise from a new study by psychologists at Washington University in St. Louis.   According to its lead author, Christopher L. Zerr, “Quicker learning appears to be more durable learning.”

The study, published in the journal Psychological Science, tried a different way to gauge differences in how quickly and well people learn and retain information.  Using pairs that matched English words with words in Lithuanian, a difficult-to-learn language, the researchers created a “learning-efficiency score” for each participant.

“In each case, initial learning speed proved to be a strong predictor of long-term retention,” said senior author Kathleen B. McDermott, professor of psychological and brain sciences at Washington University.

Forty-six of the participants returned for a follow-up study three years later.  The results confirmed the earlier findings.

What explains this outcome?  The researchers suggest two possibilities.

First, individuals may differ in attention control:  those better able to control their attention can focus more effectively while learning material, avoiding both distraction and forgetting.  Another explanation:  efficient learners use more effective learning strategies, like using a key word to relate the two words in a pair.

The researchers don’t think their job is done.  Instead, they’d like to see future research on learning efficiency that would have an impact in educational and clinical settings.

The goal is to be able to teach students how to be efficient learners, and to forestall the effects of disease, aging, and neuropsychological disorders on learning and retention.

Conclusion:  If you’ve always been a quick learner, that’s probably stood you in good stead, enabling you to remember stuff you learned quickly in the first place.

 

[This blog post is not the one I originally intended to write this month, when I planned to focus on how important it is to vote in the midterm elections in November.  Publishing my new novel, RED DIANA, this month has kept me from writing that post, but I hope to publish it at some point.  It would be something of a reprise of a post I published in September 2014, “What Women Need to Do.”]

Proms and “The Twelfth of Never”

It’s prom season in America.

Do you remember your senior prom?

The twelfth of June never fails to remind me of mine.

The prom committee named our prom “The Twelfth of Never,” and it’s easy to remember why.  The prom took place on June 12th.  The name was also that of a popular song recorded by Johnny Mathis–one of my favorites on his album, “Johnny’s Greatest Hits.”

As one of Johnny’s fans, I owned this album and played it over and over till I knew the words to all of the songs, including this one.  Many of his songs became standards, and PBS has recently been showcasing his music in one of its most appealing fund-raising lures.

I immortalized the song title in my own small way by writing in my novel Jealous Mistress that the protagonist, Alison Ross, hears it playing while she shops in her supermarket in 1981: “My fellow shoppers were gliding up and down the aisles of the Jewel, picking items off shelves to the tune of ‘The Twelfth of Never.’”

When I was 11 or 12, my favorite crooner was Eddie Fisher, who was then at the top of his game.  But by my last year of high school, I’d shifted my loyalties to Johnny Mathis and Harry Belafonte.  In addition to Johnny’s album, I treasured Belafonte’s astonishing “Belafonte” LP and played it, like Johnny’s, over and over, learning those words, too.

Although I wasn’t part of the prom committee (I was busy chairing the luncheon committee), and “the twelfth of never” referred to a date when something was never going to happen, I was okay with the name the committee chose.  My more pressing concern was who would be my date.  Would it be my current crush, a friend since first grade who’d metamorphosed into the man of my dreams?  (I hoped so.)  Would it be last year’s junior prom date?  (I hoped not.)  Who exactly would it be?

As luck would have it, an amiable and very bright classmate named Allen stepped forward and asked me to go to the prom.  I could finally relax on that score.  But we weren’t really on the same wavelength.  When we went on a few other dates before the prom, they became increasingly awkward.

On one date we saw “Some Like It Hot” at a filled-to-capacity downtown Chicago movie theater, where we sat in the last row of the balcony.  The film was terrific (it’s been judged the top comedy film of all time by the American Film Institute), and Allen clearly loved it.  His delight unfortunately ended in an ache or two.  When he heard the last line, spoken by Joe E. Brown to Jack Lemmon (“Well, nobody’s perfect”), Allen laughed uproariously, threw his head back, and hit it on the wall behind our seats.  I felt sorry for him—it must have hurt—but it was still pretty hard to stifle a laugh.  (I don’t think it hurt his brainpower, though.  As I recall, Allen went on to enroll at MIT.)

Although the bloom was off the rose by the time the prom came along, Allen and I went off happily together to dance on the ballroom floor of the downtown Knickerbocker Hotel, noted for the floor’s colored lights.  (The Knickerbocker spent the 1970s as the icky Playboy Towers but has since reverted to its original name.)  We then proceeded to celebrate some more by watching the remarkable ice-skating show offered on a tiny rink surrounded by tables filled with patrons, like a bunch of us prom-goers, at still another big hotel downtown.

Most of us were unknowingly living through an era of innocence.  For some of my classmates, the prom may have involved heavy kissing, but I doubt that much more than that happened.  In my case, absolutely nothing happened except for a chaste kiss at the end of the evening.

For better or worse, proms have evolved into a whole different scene.  In April, The Wall Street Journal noted that although the rules of prom used to be simple, they’re more complicated today.  At Boylan Catholic High School in Illinois, for example, a 21-page rulebook governs acceptable prom-wear.  Other schools require pre-approval of the prom dresses students plan to wear–in one school by a coach, in another by a three-person committee.

Administrators add new rules every year “to address new trends and safety concerns.” These have included banning canes, boys’ ponytails, and saggy pants, as well as two-piece dresses that might reveal midriffs and dresses with mesh cutouts that suggest bare skin.

But students have begun to revolt.  The students at Boylan Catholic have organized their own prom, arguing that the 21-page dress code contributed to body-shaming.  They point to a rule that states: “Some girls may wear the same dress, but due to body types, one dress may be acceptable while the other is not.”  A male student who helped organize Morp (the alternative prom) said that “girls were offended…. Somebody needed to step up and do something.”

At a school in Alabama, one student hoped to take his grandmother to his prom since she’d never been to one, but her age exceeded the maximum of 20, so she wasn’t allowed to go.  The student was “mad,” skipped the school prom, and celebrated at his grandmother’s home instead.  Not surprisingly, the school defended its rule, stating that it wanted to discourage students’ inviting older relatives who might present a safety issue by drinking alcohol:  “It just causes problems.”  But the school district later joined with a senior center to host an annual prom for senior citizens.  Presumably, Granny went to a prom after all.

According to the Journal, New York City students have another option altogether.  The New York Public Library hosts an annual free “Anti-Prom” in June for students 12 to 18, who can attend in any garb they choose.

In the Bay Area, another phenomenon has occurred:  “promposals”–photos and videos posted on social media in which one student asks another one to prom.  The San Francisco Chronicle views these as a way for kids “to turn themselves into YouTube, Twitter and Instagram sensations.”  In 2014, a boy trotted up to school on a horse, holding a sign that asked his girlfriend to “ride to prom” with him.  Last year, a kid built a makeshift “castle” and wrote a Shakespearean-style play to ask a friend to prom.  And in Berkeley, a boy choreographed a hip-hop dance routine with a bunch of other kids and performed it for his hoped-for date in front of 200 classmates.

In April, the Chronicle reported data on the national emergence of promposals.  From only 17 on Twitter in 2009, the number grew to 764,000 in 2015, while on YouTube, videos went from 56,000 in 2009 to 180,000 last year.  (Millions of teens also post pictures about the prom itself on Instagram.)  The promposal phenomenon may be dying down, with fewer elaborate ones noted this year at a school in Oakland.  But who knows?

One thing we know for certain:  The high school prom scene has changed.

But even though things have changed, prom-goers today are still teenagers much like us when we went to prom, with all of the insecurities and anxieties that go along with being a teen.

For me, mostly-happy memories of “The Twelfth of Never” return every year on the twelfth of June.   Maybe mostly-happy, or not-so-happy, memories of your prom return every year as well.

As Johnny’s song reminds us, our memories of prom can endure for “a long, long time.”

Rudeness: A Rude Awakening

Rudeness seems to be on the rise.  Why?

Being rude rarely makes anyone feel better.  I’ve often wondered why people in professions where they meet the public, like servers in a restaurant, decide to act rudely, when greeting the public with a more cheerful demeanor probably would make everyone feel better.

Pressure undoubtedly plays a huge role.  Pressure to perform at work and pressure to get everywhere as fast as possible.  Pressure can create a high degree of stress–the kind of stress that leads to unfortunate results.

Let’s be specific about “getting everywhere.”  I blame a lot of rude behavior on the incessantly increasing traffic many of us are forced to confront.  It makes life difficult, even scary, for pedestrians as well as drivers.

How many times have you, as a pedestrian in a crosswalk, been nearly swiped by the car of a driver turning way too fast?

How many times have you, as a driver, been cut off by arrogant drivers who aggressively push their way in front of your car, often violating the rules of the road?  The extreme end of this spectrum:  “road rage.”

All of these instances of rudeness can, and sometimes do, lead to fatal consequences.  But I just came across several studies documenting far more worrisome results from rude behavior:  serious errors made by doctors and nurses as a result of rudeness.

The medical profession is apparently concerned about rude behavior within its ranks, and conducting these studies reflects that concern.

One of the studies was reported on April 12 in The Wall Street Journal, which concluded that “rudeness [by physicians and nurses] can cost lives.”  In this simulated-crisis study, researchers in Israel analyzed 24 teams of physicians and nurses providing neonatal intensive care.  In a training exercise to diagnose and treat a very sick premature newborn, some teams heard a statement by an American MD observing them that he was “not impressed with the quality of medicine in Israel” and that Israeli medical staff “wouldn’t last a week” in his department.  The other teams received neutral comments about their work.

Result?  The teams exposed to incivility made significantly more errors in diagnosis and treatment.  The members of these teams collaborated and communicated with each other less, and that led to their inferior performance.

The professor of medicine at UCSF who reviewed this study for The Journal, Dr. Gurpreet Dhaliwal, asked himself:  How can snide comments sabotage experienced clinicians?  The answer offered by the study’s authors:  Rudeness interferes with working memory, the part of the cognitive system where “most planning, analysis and management” takes place.

So, as Dr. Dhaliwal notes, being “tough” in this kind of situation “sounds great, but it isn’t the psychological reality—even for those who think they are immune” to criticism.  “The cloud of negativity will sap resources in their subconscious, even if their self-affirming conscious mind tells them otherwise.”

According to a researcher in the Israeli study, many of the physicians weren’t even aware that someone had been rude.  “It was very mild incivility that people experience all the time in every workplace.”  But the result was that “cognitive resources” were drawn away from what they needed to focus on.

There’s even more evidence of the damage rudeness can cause.  Dr. Perri Klass, who writes a column on health care for The New York Times, has recently reviewed studies of rudeness in a medical setting.  Dr. Klass, a well-known pediatrician and writer, looked at what happened to medical teams when parents of sick children were rude to doctors.  This study, which also used simulated patient-emergencies, found that doctors and nurses (also working in teams in a neonatal ICU) were less effective–in teamwork, communication, and diagnostic and technical skills–after an actor playing a parent made a rude remark.

In this study, the “mother” said, “I knew we should have gone to a better hospital where they don’t practice Third World medicine.”  Klass noted that even this “mild unpleasantness” was enough to affect the doctors’ and nurses’ medical skills.

Klass was bothered by these results because even though she had always known that parents are sometimes rude, and that rudeness can be upsetting, she didn’t think that “it would actually affect my medical skills or decision making.”  But in light of these two studies, she had to question whether her own skills and decisions may have been affected by rudeness.

She noted still other studies of rudeness.  A 2015 British study found that “rude, dismissive and aggressive communication” between doctors affected 31 percent of them.  And other studies suggest that rudeness toward medical students by attending physicians, residents, and nurses is also a frequent problem.  Her wise conclusion:  “In almost any setting, rudeness… [tends] to beget rudeness.”  In a medical setting, it also “gets in the way of healing.”

Summing up:  Rudeness is out there in every part of our lives, and I think we’d all agree that rudeness is annoying.  But it’s too easy to view it as merely annoying.  Research shows that it can lead to serious errors in judgment.

In a medical setting, on a busy highway, even on city streets, it can cost lives.

We all need to find ways to reduce the stress in our daily lives.  Less stress equals less rudeness equals fewer errors in judgment that cost lives.

Hamilton, Hamilton…Who Was He Anyway?

Broadway megahit “Hamilton” has brought the Founding Parent (okay, Founding Father) into a spotlight unknown since his own era.

Let’s face it.  The Ron Chernow biography, turned into a smash Broadway musical by Lin-Manuel Miranda, has made Alexander Hamilton into the icon he hasn’t been–or maybe never was–in a century or two. Just this week, the hip-hop musical “Hamilton” received a record-breaking 16 Tony Award nominations.

Hamilton’s new-found celebrity has even influenced his modern-day successor, Treasury Secretary Jack Lew, who reversed his earlier plan to remove Hamilton from the $10 bill and replace him with the image of an American woman.

Instead, Hamilton will remain on the front of that bill, with a group of leaders of the 1913 suffrage march appearing on the back, while Harriet Tubman will replace no-longer-revered and now-reviled President Andrew Jackson on the front of the $20 bill.  We’ll see other changes to our paper currency during the next five years.

But an intriguing question remains:  How many Americans—putting aside those caught up in the frenzy on Broadway, where theatergoers are forking over $300 and $400 to see “Hamilton” on stage—know who Hamilton really was?

A recent study by memory researchers at Washington University in St. Louis found that most Americans are confident that Hamilton was once president of the United States.

According to Henry L. Roediger III, a human memory expert at Wash U, “Our findings from a recent survey suggest that about 71 percent of Americans are fairly certain that [Hamilton] is among our nation’s past presidents.  I had predicted that Benjamin Franklin would be the person most falsely recognized as a president, but Hamilton beat him by a mile.”

Roediger (whose official academic title is the James S. McDonnell Distinguished University Professor in Arts & Sciences) has been testing undergraduate students since 1973, when he first administered a test while he was himself a psychology grad student at Yale.  His 2014 study, published in the journal Science, suggested that we as a nation do fairly well at naming the first few and the last few presidents.  But fewer than 20 percent of us can remember more than the last 8 or 9 presidents in order.

Roediger’s more recent study is a bit different because its goal was to gauge how well Americans simply recognize the names of past presidents.  Name-recognition should be much less difficult than recalling names from memory and listing them on a blank sheet of paper, which was the challenge in 2014.

The 2016 study, published in February in the journal Psychological Science, asked participants to identify past presidents, using a list of names that included actual presidents as well as famous non-presidents like Hamilton and Franklin.  Other familiar names from U.S. history, and non-famous but common names, were also included.

Participants were asked to indicate their level of certainty on a scale from zero to 100, where 100 was absolutely certain.

What happened?  The rate for correctly recognizing the names of past presidents was 88 percent overall, although laggards Franklin Pierce and Chester Arthur rated less than 60 percent.

Hamilton was more frequently identified as president (with 71 percent thinking that he was) than several actual presidents, and people were very confident (83 on the 100-point scale) that he had been president.

More than a quarter of the participants incorrectly recognized others, notably Franklin, Hubert Humphrey, and John Calhoun, as past presidents.  Roediger thinks that probably happened because people are aware that these were important figures in American history without really knowing what their actual roles were.

Roediger and his co-author, K. Andrew DeSoto, suggest that our ability to recognize the names of famous people hinges on their names appearing in a context related to the source of their fame.  “Elvis Presley was famous, but he would never be recognized as a past president,” Roediger says.   It’s not enough to have a familiar name.  It must be “a familiar name in the right context.”

This study is part of an emerging line of research focusing on how people remember history.  The recent studies reveal that the ability to remember the names of presidents follows consistent and reliable patterns.  “No matter how we test it—in the same experiment, with different people, across generations, in the laboratory, with online studies, with different types of tests—there are clear patterns in how the presidents are remembered and how they are forgotten,” DeSoto says.

While decades-old theories about memory can explain the results to some extent, these findings are sparking new ideas about fame and just how human memory-function treats those who achieve it.

As Roediger notes, “knowledge of American presidents is imperfect….”  False fame can arise from “contextual familiarity.”  And “even the most famous person in America may be forgotten in as short a time as 50 years.”

So…how will Alexander Hamilton’s new-found celebrity hold up?  Judging from the astounding success of the hip-hop musical focusing on him and his cohorts, one can predict with some confidence that his memory will endure far longer than it otherwise might have.

This time, he may even be remembered as our first Secretary of the Treasury, not as the president he never was.