Category Archives: Washington University in St. Louis

My Life as a Shopper

I have a new outlook on shopping.  I’m no longer shopping the way I used to.

Why?

I’ll start at the beginning.  My long history of shopping began when I was very young.

My parents were both immersed in retailing.  My mother’s parents immigrated to Chicago from Eastern Europe and, soon after arriving, opened a clothing store on Milwaukee Avenue.  Their enterprise evolved into a modest chain of women’s apparel stores, and throughout her life my mother was intimately involved in the business.  She instilled in me the ethos that shopping for new things, especially clothes, was a good thing.  Under her influence, I gave away countless wearable items of clothing in favor of getting something new, preferably something sold in one of her family’s stores.  (I later regretted parting with some of the perfectly good items I could have continued to wear for many more years.)

Even though my father received a degree in pharmacy from the University of Illinois, and he enjoyed some aspects of his work as a pharmacist, he was himself attracted to retailing.  At a young age, he opened his own drugstore on the South Side of Chicago (I treasure a black-and-white photo of him standing in front of his store’s window).  After marrying my mother, he spent a number of years working in her family’s business, and in the late ‘40s the two of them opened a women’s clothing boutique on Rush Street, a short distance from Oak Street, in a soon-to-be-trendy shopping area.  Ahead of its time, the boutique quickly folded, but Daddy never lost his taste for retailing.

In view of this history, I was fated to become a “shopper.”  After Daddy died when I was 12, our family wasn’t able to spend big wads of money on anything, including clothes.  But my mother’s inclination to buy new clothes never really ceased.

Thanks to generous scholarship and fellowship awards, I made my way through college and grad school on a minuscule budget.  I saved money by spending almost nothing, savoring the 99-cent dinner at Harkness Commons almost every night during law school.  And because I began my legal career with a $6,000 annual salary as a federal judge’s law clerk and, as a lawyer, never pursued a high-paying job (I preferred to work on behalf of the poor, for example), I got by without big-time shopping.

Marriage brought little change at first.  My darling new husband also came from a modest background and was not a big spender, even when our salaries began to move up a bit.

But things eventually changed.  Higher salaries and the arrival of new retail chain stores featuring bargain prices made buying stuff much more tempting.  I needed presentable clothes for my new full-time jobs.  Our daughters needed to be garbed in clothes like those the other kids wore.  Our living room chairs from Sears began to look shabby, propelling us toward somewhat better home décor.

A raft of other changes led me to spend more time shopping.  My boring law-firm jobs were more tolerable if I could escape during my lunch hour and browse at nearby stores.  The rise of outlet malls made bargain shopping easier than ever.  And travels to new cities and countries inspired buying small, easily packable items, like books and jewelry.

After I moved to San Francisco, having jettisoned possessions I’d lived with for years in my former home, I needed to acquire new ones.  So there I was, buying furniture and kitchen equipment for my sunny new apartment.

At the same time, our consumption-driven culture continued to push us to buy more and more, including the “fast fashion” that emerged, offering stylish clothes at temptingly low prices.

But this emphasis on acquiring new stuff, even low-priced stuff, has finally lost its appeal.

I’ve come to realize that I don’t need it.

My overall goal is to simplify my life.  This means giving away a lot of things I don’t need, like stacks of books I’ll never read and charming bric-a-brac that’s sitting on a shelf collecting dust.  Like clothes that a disadvantaged person needs more than I do.

My new focus:  First, use what I already have.  Next, do not buy anything new unless I absolutely need it.

Choosing not to acquire new clothes—in essence, reusing what I already have, adopting the slogan “shop your closet”—is a perfect example of my new outlook.

I’ve previously written about confining one’s new purchases to “reunion-worthy” clothes.  [Please see my blog post of October 12, 2017, advising readers to choose their purchases carefully, making sure that any clothes they buy are flattering enough to wear at a school reunion.]

But that doesn’t go far enough.  New purchases should be necessary.

I find that I’m not alone in adopting this approach.

Many millennials have eschewed buying consumer goods, opting for new experiences instead of new material things.  I guess I agree with the millennials’ outlook on this subject.

Here’s other evidence of this approach.  An article in The Guardian in July 2019 shouted “’Don’t feed the monster!’ The people who have stopped buying new clothes.”  Writer Paula Cocozza noted the growing number of people who love clothes but resist buying new ones because of their lack of sustainability:  many consumers she interviewed had switched to second-hand shopping so they would not perpetuate that cycle of consumption and waste.

Second-hand shopping has even taken off online.  In September, the San Francisco Chronicle noted the “wave of new resale apps and marketplaces” adding to longtime resale giants like eBay.  At the same time, The New York Times, covering Fashion Week in Milan, wrote that there was “a lot of talk about sustainability over the last two weeks of collections, and about fashion’s role in the climate crisis.”  The Times added:  “the idea of creating clothes that last—that people want to buy and actually keep, keep wearing and never throw out, recycle or resell”—had become an important part of that subject.  It quoted Miuccia Prada, doyenne of the high-end clothing firm Prada:  “we need to do less.  There is too much fashion, too much clothes, too much of everything.”

Enter Tatiana Schlossberg and her new book, Inconspicuous Consumption:  The Environmental Impact You Don’t Know You Have (2019).  In the middle of an absorbing chapter titled “Fashion,” she notes that “There’s something appealing about being able to buy really cheap, fashionable clothing […] but it has given us a false sense of inexpensiveness.  It’s not only that the clothes are cheap; it’s that no one is paying for the long-term costs of the waste we create just from buying as much as we can afford….”

Some scholars have specifically focused on this issue, the “overabundance of fast fashion—readily available, inexpensively made new clothing,” because it has created “an environmental and social justice crisis.”  Christine Ekenga, an assistant professor at Washington University in St. Louis, has co-authored a paper focused on the “global environmental injustice of fast fashion,” asserting that the fast-fashion supply chain has created a dilemma.  While consumers can buy more clothes for less, those who work in or live near textile manufacturing facilities bear a disproportionate burden of environmental health hazards.  Further, millions of tons of textile waste sit in landfills and other settings, hurting the low-income countries that produce many of these clothes.  In the U.S., about 85 percent of the clothing Americans consume (nearly 80 pounds per American per year) is sent to landfills as solid waste.  [See “The Global Environmental Injustice of Fast Fashion” in the journal Environmental Health.]

A high-profile public figure had an epiphany along the same lines that should influence all of us.  The late Doug Tompkins was one of the founders of The North Face and later moved on to help establish the apparel chain Esprit.  At the height of Esprit’s success, he sold his stake in the company for about $150 million and moved to Chile, where he embraced a whole new outlook on life, one with a strong emphasis on ecology.  He bought up properties for conservation purposes, in this way “paying my rent for living on the planet.”  Most tellingly, he said, “I left that world of making stuff that nobody really needed because I realized that all of this needless overconsumption is one of the driving forces of the [environmental] crisis, the mother of all crises.”  [Sierra magazine, September/October 2019.]

Author Marie Kondo fits in here.  She has earned fame as a de-cluttering expert, helping people who feel overwhelmed with too much stuff to tidy up their homes.  Her focus is on reducing clutter that’s already there, so she doesn’t zero in on new purchases.  But I applaud her overall outlook.  As part of de-cluttering, she advises:  As you consider keeping or letting go of an item, hold it in your hands and ask:  “Does this item bring me joy?”  This concept of ensuring that an item brings you joy could apply to new purchases as well, so long as the item bringing you joy is also one you really need.

What should those of us enmeshed in our consumer culture do?  In The Wall Street Journal in July 2019, April Lane Benson, a “shopping-addiction-focused psychologist and the author of ‘To Buy or Not to Buy:  Why We Overshop and How to Stop’,” suggested that if a consumer is contemplating a purchase, she should ask herself six simple questions:  “Why am I here? How do I feel? Do I need this? What if I wait? How will I pay for it? Where will I put it?”

Benson’s list of questions is a good one.  Answering them could go a long way toward helping someone avoid making a compulsive purchase.  But let’s remember:  Benson is talking about a shopper already in a store, considering whether to buy something she’s already selected in her search for something new.  How many shoppers will interrupt a shopping trip like that to answer Benson’s questions?

I suggest a much more ambitious scheme:  Simply resolve not to buy anything you don’t need!

My 11-year-old granddaughter has the right idea:  She’s a minimalist who has rejected any number of gifts from me, including some fetching new clothes, telling me she doesn’t need them.

When I reflect on my life as a shopper, I now understand why and how I became the shopper I did.  Perhaps, in light of my family history and the increasingly consumption-driven culture I’ve lived through, I didn’t really have an option.

But I have regrets:  I’ve wasted countless hours browsing in stores, looking through racks and poring over shelves for things to buy, much of which I didn’t need, then spending additional hours returning some of the things I had just purchased.

These are hours I could have spent far more wisely.  Pursuing my creative work, exercising more often and more vigorously, doing more to help those in need.

Readers:  Please don’t make the mistakes I have.  Adopt my new philosophy.  You’ll have many more hours in your life to pursue far more rewarding goals than acquiring consumer goods you don’t really need.


Remembering Stuff

Are you able to remember stuff pretty well?  If you learned that stuff quickly, you have a very good chance of retaining it, even if you spent less time studying it than you might have.

These conclusions arise from a new study by psychologists at Washington University in St. Louis.   According to its lead author, Christopher L. Zerr, “Quicker learning appears to be more durable learning.”

The study, published in the journal Psychological Science, tried a different way to gauge differences in how quickly and well people learn and retain information.  Using word-pairs that paired English with a difficult-to-learn language, Lithuanian, the researchers created a “learning-efficiency score” for each participant.

“In each case, initial learning speed proved to be a strong predictor of long-term retention,” said senior author Kathleen B. McDermott, professor of psychological and brain sciences at Washington University.

Forty-six of the participants returned for a follow-up study three years later.  The results confirmed those of the earlier study.

What explains this outcome?  The researchers suggest two possibilities.

First, individuals may differ in attention control:  those better able to focus while learning material can avoid distraction and forgetting.  Another explanation:  efficient learners use more effective learning strategies, like using a key word to relate the two words in a pair.

The researchers don’t think their job is done.  Instead, they’d like to see future research on learning efficiency that would have an impact in educational and clinical settings.

The goal is to be able to teach students how to be efficient learners, and to forestall the effects of disease, aging, and neuropsychological disorders on learning and retention.

Conclusion:  If you’ve always been a quick learner, that’s probably stood you in good stead, enabling you to remember stuff you learned quickly in the first place.


[This blog post is not the one I originally intended to write this month, when I planned to focus on how important it is to vote in the midterm elections in November.  Publishing my new novel, RED DIANA, this month has kept me from writing that post, but I hope to publish it at some point.  It would be something of a reprise of a post I published in September 2014, “What Women Need to Do.”]

How Big Is Your Signature?

I’ll bet you never thought that the size of your signature meant a darned thing.  But guess what.  It does.

A recent study by a business school professor at Washington University in St. Louis found that an oversized signature is correlated with an oversized ego, and a great big ego can have practical business consequences.

In my view, the research findings apply to people running enterprises in a whole lot of areas other than corporations, including the White House.

Research by Chad Ham, a professor at Wash U’s Olin Business School, concluded that a CFO (chief financial officer) with an oversized signature is more likely to make questionable choices because of his/her oversized ego.

Professor Ham’s research, described in a paper published in the December issue of the Journal of Accounting Research, showed that “narcissistic CFOs are less likely to recognize losses” promptly, and that this behavior is “consistent with a willingness to cover up past mistakes.”

Ham’s research team, which included academics from the University of North Carolina, the University of Maryland, and Rice University, looked at the size of a CFO’s signature and how it related to his/her level of narcissism. They then connected those findings to the financial reporting by each CFO’s corporation.

The research team compared the performance of the companies before and after the CFO was appointed.  The result was telling.  The firms became “more aggressive” when narcissistic CFOs were appointed.

Ham’s research went beyond CFOs.  It also looked at CEOs, linking management results with their level of narcissism.  Narcissistic CEOs—those with larger signatures—tended to over-invest in riskier projects and received higher compensation despite poor financial performance.

Why did the researchers focus so much on signature size?  According to Ham, it was one way to “capture narcissism.”  They couldn’t simply ask top corporate executives to submit to a personality test.  So they used signature size as a proxy for the level of narcissism.

In other research, the team paired student volunteers and asked them to allocate $5 between themselves and their anonymous partners.  After that assignment, the students had to fill out a personality test and sign their names.  The result?  Students with larger signatures tended to be more narcissistic, and the more narcissistic participants were more likely to keep a larger share of the $5 for themselves and to misreport how they had allocated the money.

Ham doesn’t advocate that corporate boards avoid hiring as CEOs and CFOs everyone with a large signature.  It’s not that simple.  A single narcissistic CFO might benefit the company in ways not measured in this study.  But Ham concluded that corporate leaders should be aware of their colleagues’ narcissistic tendencies in order to keep “appropriate checks and balances in place.”

Wow.  I think this research can be extrapolated to the heads of entities other than corporations.  For example, the current occupant of the White House likes to show off his oversized signature on whatever documents he signs.  His large signature is just one indication of his level of narcissism, which is related to his eagerness to embark on risky projects, to misreport how he allocates funds, and to cover up his past mistakes.

As for keeping appropriate “checks and balances in place,” we might as well forget it.  Where are the other leaders of our government—in particular, those currently belonging to the majority party in Congress—who could exhibit some strength of character and insist that “checks and balances,” a long-held principle of our democracy, be observed?

They’re nowhere to be found.

I’m wondering just how big their signatures are.

Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good beige fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight-gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

Punting on the Cam

The keys to my front door reside on a key ring I bought in Cambridge, England, on a magical day in September 1986.  It’s one of the souvenir key rings you used to find in Britain (and maybe still can, though I didn’t see any during a visit in 2012).  They were fashioned in leather and emblazoned in gold leaf with the name and design of a notable site.

During trips to London and elsewhere in Britain during the 1980s and ‘90s, I acquired a host of these key rings. One of my favorites was a bright red one purchased at Cardiff Castle in Wales in 1995.  I would carry one of them in my purse until the gold design wore off and the leather became so worn that it began to fall apart.

Until recently, I thought I had used every one of these leather key rings.  But then, in a bag filled with souvenir key rings, I came across the one I bought in Cambridge in 1986.  There it was, in all its splendor:  black leather emblazoned with the gold-leaf crest of King’s College, Cambridge.

I began using it right away, and the gold design is already fading.  But my memories of that day in Cambridge will never fade.

My husband Herb had gone off to Germany to attend a math conference while I remained at home with our two young daughters.  But we excitedly planned to rendezvous in London, one of our favorite cities, when his conference was over.

Happily for us, Grandma agreed to stay with our daughters while I traveled to meet Herb, and on a rainy September morning I arrived in London and checked into our Bloomsbury hotel.  Soon I set off in the rain to find theater tickets for that evening, and in Leicester Square I bought half-price tickets for a comedy I knew nothing about, “Lend Me a Tenor.”  Stopping afterwards for tea at Fortnum and Mason’s eased the pain of trekking through the rain.

When Herb and I finally met up, we dined at an Italian restaurant and headed for the theater. “Lend Me a Tenor” was hilarious and set the tone for a wonderful week together.

We covered a lot of ground in London that week, including a visit to Carlyle’s house in Chelsea, a sunny boat trip to Greenwich, viewing notable Brits on the walls of the National Portrait Gallery, tramping around Bloomsbury and Hampstead, and lunching with a British lawyer (a law-school friend) at The Temple, an Inn of Court made famous by our favorite TV barrister, Rumpole (of the Bailey), whose chambers were allegedly in The Temple.

Other highlights were our evenings at the theater. Thanks to advice from my sister, who’d just been in London, we ordered tickets before leaving home for the new smash musical, “Les Miserables” (which hadn’t yet hit Broadway). It was worth every penny of the $75 we paid per ticket (a pricey sum in 1986) to see Colm Wilkinson portray Jean Valjean on the stage of the Palace Theatre.  We also loved seeing a fresh interpretation of “The Merry Wives of Windsor” at the Barbican and Alan Ayckbourn’s poignant comedy “A Chorus of Disapproval” at the Lyric.  Although “Mutiny!”–a musical based on “Mutiny on the Bounty”–was disappointing, we relished a concert at South Bank’s Royal Festival Hall, where I kept expecting the Queen to enter and unceremoniously plop herself down in one of the hall’s many boxes.

But it was our day trip to Cambridge that was the centerpiece of our week.  On Friday, September 19th, we set out by train from King’s Cross Station and arrived at Cambridge in just over an hour.  We immediately reveled in the array of beautiful sights leaping out at us on the university campus nestled along the River Cam.  Our first stop was Queens’ College and its remarkable Mathematical Bridge.  The college spans both sides of the river (students jokingly refer to the newer half as the “light side” and the older half as the “dark side”), and the world-famous bridge connects the two.  The legend goes that the bridge was designed and built by Cambridge scholar Sir Isaac Newton without the use of nuts or bolts.  But in fact it was built with nuts and bolts in 1749, 22 years after Newton died, and rebuilt in 1905.

Our next must-see site was King’s College.  During my college years at Washington University in St. Louis, I learned that Graham Chapel, our strikingly beautiful chapel–built in 1909 and the site of many exhilarating lectures and concerts (in which I often sang)–shared its design with that of King’s College, Cambridge.  So we headed right for it.  (Graham Chapel’s architect never maintained that it was an exact copy but was only partly modeled after King’s College Chapel, which is far larger.)

Entering the huge and impressive Cambridge version, we were suitably awed by its magnificence.  Begun by King Henry VI in 1446, it features the largest “fan vault” in the world and astonishingly beautiful medieval stained glass.  (A fan vault? It’s a Gothic vault in which the ribs are all curved the same and spaced evenly, resembling a fan.)

As we left the chapel, still reeling from all the stunning places we’d just seen, we noticed signs pointing us in the direction of punts available for a ride on the Cam.  The idea of “punting on the Cam”—riding down the river on one of the flat-bottomed boats that have been around since 1902–sounded wonderful.  We didn’t hesitate to pay the fare and immediately seated ourselves in one of the boats.

The river was serene, with only a few other boats floating nearby, and our punter, a charming young man in a straw boater hat, provided intelligent narration as we floated past the campus buildings stretched out along the river.  He propelled the boat by pushing against the river bed with a long pole.  His charm and good looks enhanced our ride enormously.

The boat wasn’t crowded.  An older British couple sat directly across from us, and we chatted amiably about Britain and the United States, finding commonality where we could.

The sun was shining, and the 70-degree temperature was perfect.  Beautiful old trees dotted the riverbanks, providing shade as we floated by, admiring the exquisite college buildings.

What’s punting like?  Ideally, it’s a calm, soothing boat ride on a river like the Cam.  Something like riding in a gondola in Venice, except that gondolas are propelled by oars instead of poles. (I rush to add that the gondola I rode in Venice had a much less attractive and charming oarsman.)

An article in the Wall Street Journal in November described recent problems caused by punting’s growing popularity.  Increased congestion in the Cam has led to safety rules and regulations never needed in the past.  According to the Journal, “punt wars” have divided the city of Cambridge, with traditional boats required to follow the new rules while upstart self-hire boats, which have created most of the problems, are not.

But luckily for Herb and me, problems like those didn’t exist in 1986.  Not at all.  Back then, floating along the river with my adored husband by my side was an idyllic experience that has a special place in my memory.

I don’t recall where I bought my leather key ring.  Perhaps in a small shop somewhere in Cambridge.  But no matter where I bought it, it remains a happy reminder of a truly extraordinary day.


Hamilton, Hamilton…Who Was He Anyway?

Broadway megahit “Hamilton” has brought the Founding Parent (okay, Founding Father) into a spotlight unknown since his own era.

Let’s face it.  The Ron Chernow biography, turned into a smash Broadway musical by Lin-Manuel Miranda, has made Alexander Hamilton into the icon he hasn’t been—or maybe never was—in a century or two.  Just this week, the hip-hop musical “Hamilton” received a record-breaking 16 Tony Award nominations.

His new-found celebrity has even influenced his modern-day successor, current Treasury Secretary Jack Lew, leading Lew to reverse his earlier plan to remove Hamilton from the $10 bill and replace him with the image of an American woman.

Instead, Hamilton will remain on the front of that bill, with a group representing suffragette leaders in 1913 appearing on the back, while Harriet Tubman will replace no-longer-revered and now-reviled President Andrew Jackson on the front of the $20 bill.  We’ll see other changes to our paper currency during the next five years.

But an intriguing question remains:  How many Americans—putting aside those caught up in the frenzy on Broadway, where theatergoers are forking over $300 and $400 to see “Hamilton” on stage—know who Hamilton really was?

A recent study done by memory researchers at Washington University in St. Louis has confirmed that most Americans are confident that Hamilton was once president of the United States.

According to Henry L. Roediger III, a human memory expert at Wash U, “Our findings from a recent survey suggest that about 71 percent of Americans are fairly certain that [Hamilton] is among our nation’s past presidents.  I had predicted that Benjamin Franklin would be the person most falsely recognized as a president, but Hamilton beat him by a mile.”

Roediger (whose official academic title is the James S. McDonnell Distinguished University Professor in Arts & Sciences) has been testing undergrad college students since 1973, when he first administered a test while he was himself a psychology grad student at Yale.  His 2014 study, published in the journal Science, suggested that we as a nation do fairly well at naming the first few and the last few presidents.  But fewer than 20 percent of us can remember more than the last 8 or 9 presidents in order.

Roediger’s more recent study is a bit different because its goal was to gauge how well Americans simply recognize the names of past presidents.  Name-recognition should be much less difficult than recalling names from memory and listing them on a blank sheet of paper, which was the challenge in 2014.

The 2016 study, published in February in the journal Psychological Science, asked participants to identify past presidents, using a list of names that included actual presidents as well as famous non-presidents like Hamilton and Franklin.  Other familiar names from U.S. history, and non-famous but common names, were also included.

Participants were asked to indicate their level of certainty on a scale from zero to 100, where 100 was absolutely certain.

What happened?  The rate for correctly recognizing the names of past presidents was 88 percent overall, although laggards Franklin Pierce and Chester Arthur rated less than 60 percent.

Hamilton was more frequently identified as president (with 71 percent thinking that he was) than several actual presidents, and people were very confident (83 on the 100-point scale) that he had been president.

More than a quarter of the participants incorrectly recognized others, notably Franklin, Hubert Humphrey, and John Calhoun, as past presidents.  Roediger thinks that probably happened because people are aware that these were important figures in American history without really knowing what their actual roles were.

Roediger and his co-author, K. Andrew DeSoto, suggest that our ability to recognize the names of famous people hinges on their names appearing in a context related to the source of their fame.  “Elvis Presley was famous, but he would never be recognized as a past president,” Roediger says.   It’s not enough to have a familiar name.  It must be “a familiar name in the right context.”

This study is part of an emerging line of research focusing on how people remember history.  The recent studies reveal that the ability to remember the names of presidents follows consistent and reliable patterns.  “No matter how we test it—in the same experiment, with different people, across generations, in the laboratory, with online studies, with different types of tests—there are clear patterns in how the presidents are remembered and how they are forgotten,” DeSoto says.

While decades-old theories about memory can explain the results to some extent, these findings are sparking new ideas about fame and just how human memory-function treats those who achieve it.

As Roediger notes, “knowledge of American presidents is imperfect….”  False fame can arise from “contextual familiarity.”  And “even the most famous person in America may be forgotten in as short a time as 50 years.”

So…how will Alexander Hamilton’s new-found celebrity hold up?  Judging from the astounding success of the hip-hop musical focusing on him and his cohorts, one can predict with some confidence that his memory will endure far longer than it otherwise might have.

This time, he may even be remembered as our first Secretary of the Treasury, not as the president he never was.