
Eating Dessert Can Help You Eat Better? Seriously?

I just celebrated my birthday with a scrumptious meal at a charming San Francisco restaurant. Sharing a fabulous candle-topped dessert with my companion was a slam-dunk way to end a perfect meal in a splendid restaurant.

Should I regret consuming that delicious dessert?

The answer, happily, is no.  I should have no regrets about eating my birthday surprise, and a recent study backs me up.

According to this study, published in the Journal of Experimental Psychology: Applied and reported in a recent issue of TIME magazine, having an occasional dessert may actually be a useful tool to help you eat better.

Here’s what happened:  More than 130 university students and staff were offered a choice of two desserts and asked to make their choice at the start of the lunch line in a campus cafeteria.  The study found that those who made the “decadent” selection—lemon cheesecake—chose healthier meals and consumed fewer calories overall than those who picked fresh fruit.  Simply selecting it first was enough to influence the rest of their order.

Almost 70 percent of those who picked the cheesecake went on to choose a healthier main dish and side dish, while only about a third of those selecting fruit made the healthier choice.  The cheesecake-choosers also ate about 250 fewer total calories during their meal compared with the fruit-choosers.

Study co-author Martin Reimann, an assistant professor of marketing and cognitive science at the University of Arizona, concluded that choosing something healthy first can give us a “license” to choose something less healthy later.  But if you turn that notion around and choose something more “decadent” early on, “then this license [to choose high-calorie food] has already expired.”  In other words, making a calorie-laden choice at the beginning of the meal seems to steer people toward healthier choices later.

No one is suggesting that we all indulge in dessert on an everyday basis.  For many of us, the pursuit of good health leads us to avoid sugary desserts and choose fresh fruit instead.  But Reimann believes that choosing dessert strategically can pay off.  He advises us to be “mindful and conscious about the different choices you make.”

Will I order lemon cheesecake, a chocolate brownie, or a spectacular ice-cream concoction for dessert at my next meal?  Probably not.  But I am going to keep the Arizona research in mind.

You should, too.  Beginning your meal with the knowledge that it could end with a calorie-laden dessert just might prompt you to select a super-healthy salad for your entrée, adding crunchy green veggies on the side.

 

Giving Thanks

As our country celebrates Thanksgiving, this is the perfect time for each of us to give thanks for the many wonderful people in our lives.

I’m an ardent fan of a quote by Marcel Proust that sums up my thinking:

“Let us be grateful to people who make us happy; they are the charming gardeners who make our souls blossom.”

I’ve always been a fan of giving thanks.  I raised my children to give thanks to others for whatever gifts or help they received, bolstering my words by reading and re-reading to them Richard Scarry’s “The Please and Thank You Book.”

But guess what.  Not everyone agrees with that sentiment.  These nay-sayers prefer to ignore the concept of gratitude.  They reject the idea of thanking others for anything, including any and all attempts to make them happy.

What dolts!

Recent research confirms my point of view.

According to a story in The New York Times earlier this year, new research revealed that people really like getting thank-you notes.  Two psychologists wanted to find out why so few people actually send these notes.  The 100 or so participants in their study were asked to write a short “gratitude letter” to someone who had helped them in some way.  It took most subjects less than five minutes to write these notes.

Although the notes’ senders typically guessed that their notes would evoke nothing more than 3 out of 5 on a happiness rating, the result was very different.  After receiving the thank-you notes, the recipients reported how happy they were to get them:  many said they were “ecstatic,” scoring 4 out of 5 on the happiness rating.

Conclusion?  People tend to undervalue the positive effect they can have on others, even with a tiny investment of time. The study was published in June 2018 in the journal Psychological Science.

A vast amount of psychological research affirms the value of gratitude.

I’ll begin with its positive effect on physical health.  According to a 2012 study published in Personality and Individual Differences, grateful people experience fewer aches and pains and report feeling healthier than other people.

Gratitude also improves psychological health, reducing a multitude of toxic emotions, from envy and resentment to frustration and regret.  A leading gratitude researcher, Robert Emmons, has conducted a number of studies on the link between gratitude and well-being, confirming that gratitude increases happiness and reduces depression.

Other benefits:  gratitude enhances empathy and reduces aggression (a 2012 University of Kentucky study), improves sleep (a 2011 study in Applied Psychology: Health and Well-Being), and improves self-esteem (a 2014 study in the Journal of Applied Sport Psychology).  The list goes on and on.

So, during this Thanksgiving week, let’s keep in mind the host of studies that have demonstrated the enormously positive role gratitude plays in our daily lives.

It’s true that some of us are luckier than others, leading lives that are filled with what might be called “blessings” while others have less to be grateful for.

For those of us who have much to be thankful for, let’s be especially grateful for all of the “charming gardeners who make our souls blossom,” those who bring happiness to our remarkably fortunate lives.

And let’s work towards a day when the less fortunate in our world can join us in having much more to be grateful for.

 

Let’s keep going as long as we can

One thing everyone can agree on:  Every single day, we’re all getting older.

But we don’t have to let that indisputable fact stop us from doing what we want to do.

I just came across a spectacular example of a 96-year-old scientist who keeps on going and going and going….

By sheer coincidence, he’s a man who’s worked for decades on battery charging speed and capacity.  And he’s more than good enough to serve as an astounding example of enduring optimism and hard work.

A Wall Street Journal story in August profiled John Goodenough, who helped invent the lithium-ion battery that’s used to recharge cell phones and a host of other electronic products.  By introducing lithium cobalt oxide to the inner workings of batteries in 1980, he made batteries not only more powerful but also more portable.

At age 96, he now wants to kill off his own creation by removing the cobalt that allowed his battery to charge faster and last longer.  In April 2018, he and three co-authors published research that may lead to a new battery that’s liquid-free and cobalt-free.

Initial research shows that the new battery could potentially double the energy density of the lithium-ion battery.  That would mean that an electric car, for example, could drive twice as far on one charge.

“My mission is to try to see if I can transform the battery world before I die,” Dr. Goodenough says.  He added that he has no plans to retire.  “When I’m no longer able to drive and I’m forced to go into a nursing home, then I suppose I will be retiring.”

Goodenough works in an untidy office at the University of Texas at Austin, where he’s a professor of engineering.  He begins work between 8 and 8:30 a.m., leaves around 6 p.m., and works from home throughout the weekend.

He hand-writes his research and doesn’t own a cell phone, rejecting the mobile technology that his batteries made possible.  His car is a 10-year-old Honda that he hopes will last as long as he does.

His motivation is to help electric cars wean society off its dependence on the combustion engine, like the one in his Honda.

“He is driven by scientific curiosity, and he really wants to do something for society with the science he does,” says one of his colleagues, another engineering professor at UT, Arumugam Manthiram.

Isn’t it heartening to come across someone like John Goodenough, a remarkable human being who refuses to quit?

His story energizes me.  Although I’m considerably younger than Goodenough, it encourages me to pursue my passions no matter how old I get.

Does his story energize you, too?

 

[This blog post is somewhat shorter than usual because I’m currently in the midst of publishing my third novel, RED DIANA.  I’m hoping it will be available soon at bookstores everywhere and on Amazon.com.]

 

Sunscreen–and a father who cared

August is on its last legs, but the sun’s rays are still potent. Potent enough to require that we use sunscreen. Especially those of us whose skin is most vulnerable to those rays.

I’ve been vulnerable to the harsh effects of the sun since birth.  And I now apply sunscreen religiously to my face, hands, and arms whenever I expect to encounter sunlight.

When I was younger, sunscreen wasn’t really around.  Fortunately for my skin, I spent most of my childhood and youth in cold-weather climates where the sun was absent much of the year.  Chicago and Boston, even St. Louis, had long winters featuring gray skies instead of sunshine.

I encountered the sun mostly during summers and a seven-month stay in Los Angeles.  But my sun exposure was limited.  It was only when I was about 28 and about to embark on a trip to Mexico that I first heard of “sunblock.”  Friends advised me to seek it out at the only location where it was known to be available, a small pharmacy in downtown Chicago.   I hastened to make my way there and buy a tube of the pasty white stuff, and once I hit the Mexican sun, I applied it to my skin, sparing myself a wretched sunburn.

The pasty white stuff was a powerful reminder of my father.  Before he died when I was 12, Daddy would cover my skin with something he called zinc oxide.

Daddy was a pharmacist by training, earning a degree in pharmacy from the University of Illinois at the age of 21.  One of my favorite family photos shows Daddy in a chemistry lab at the university, learning what he needed to know to earn that degree.  His first choice was to become a doctor, but because his own father had died during Daddy’s infancy, there was no way he could afford medical school.  An irascible uncle was a pharmacist and somehow pushed Daddy into pharmacy as a less expensive route to helping people via medicine.

Daddy spent years bouncing between pharmacy and retailing, and sometimes he did both.  I treasure a photo of him as a young man standing in front of the drug store he owned on the South Side of Chicago.  When I was growing up, he sometimes worked at a pharmacy and sometimes in other retailing enterprises, but he never abandoned his knowledge of pharmaceuticals.  While working as a pharmacist, he would often bring home new drugs he believed would cure our problems.  One time I especially recall:  Because as a young child I suffered from allergies, Daddy was excited when a brand-new drug came along to help me deal with them, and he brought a bottle of it home for me.

As for preventing sunburn, Daddy would many times take a tube of zinc oxide and apply it to my skin.

During one summer or two, I didn’t totally escape a couple of bad sunburns. Daddy must have been distracted just then, and I foolishly exposed my skin to the sun.  He later applied a greasy ointment called butesin picrate to soothe my burn. But I distinctly remember that he used his knowledge of chemistry to get out that tube of zinc oxide whenever he could.

After my pivotal trip to Mexico, sunblocks became much more available.  (I also acquired a number of sunhats to shield my face from the sun.)  But looking back, I wonder about the composition of some of the sunblocks I applied to my skin for decades.  Exactly what was I adding to my chemical burden?

In 2013, the FDA banned the use of the word “sunblock,” stating that it could mislead consumers into thinking that a product was more effective than it really was.  So sunblocks have become sunscreens, but some are more powerful than others.

A compelling reason to use powerful sunscreens?  The ozone layer that protected us in the past has undergone damage in recent years, and there’s scientific concern that more of the sun’s dangerous rays can penetrate that layer, leading to increased damage to our skin.

In recent years, I’ve paid a lot of attention to what’s in the sunscreens I choose.  Some of the chemicals in available sunscreens are now condemned by groups like the Environmental Working Group (EWG) as either ineffective or hazardous to your health. (Please check EWG’s 2018 Sunscreen Guide for well-researched and detailed information regarding sunscreens.)

Let’s note, too, that the state of Hawaii has banned the future use of sunscreens that include one of these chemicals, oxybenzone, because it washes off swimmers’ skin into ocean waters and has been shown to be harmful to coral reefs.  If it’s harming coral, what is it doing to us?

Because I now make the very deliberate choice to avoid using sunscreens harboring suspect chemicals, I use only those sunscreens whose active ingredients include—guess what—zinc oxide.  Sometimes another safe ingredient, titanium dioxide, is added.  The science behind these two mineral (rather than chemical) ingredients?  Both have inorganic particulates that reflect, scatter, and absorb damaging UVA and UVB rays.

Daddy, I think you’d be happy to know that science has acknowledged what you knew all those years ago.  Pasty white zinc oxide still stands tall as one of the very best barriers to repel the sun’s damaging rays.

In a lifetime filled with many setbacks, both physical and professional, my father always took joy in his family.  He showered us with his love, demonstrating that he cared for us in innumerable ways.

Every time I apply a sunscreen based on zinc oxide, I think of you, Daddy.  With love, with respect for your vast knowledge, and with gratitude that you cared so much for us and did everything you could to help us live a healthier life.

 

Who the Heck Knows?

I have a new catch phrase:  “Who the heck knows?”

I started using it last fall, and ever since then I’ve found that it applies to almost everything that might arise in the future.

I don’t claim originality, but here’s how I came up with it:

At a class reunion in October, I was asked to be part of a panel of law school classmates who had veered off the usual lawyer-track and now worked in a totally different area.

Specifically, I was asked to address a simple question:  Why did I leave my work as a lawyer/law professor and decide to focus primarily on writing?

First, I explained that I’d always loved writing, continued to write even while I worked as a lawyer, and left my law-related jobs when they no longer seemed meaningful.  I added that my move to San Francisco led to launching my blog and publishing my first two novels.

I concluded:

“If I stay healthy and my brain keeps functioning, I want to continue to write, with an increasing focus on memoirs….  I’ll keep putting a lot of this kind of stuff on my blog.  And maybe it will turn into a book or books someday.

“Who the heck knows?”

 

After I said all that, I realized that my final sentence was the perfect way to respond to almost any question about the future.

Here’s why it seems to me to apply to almost everything:

None of us knows what the next day will bring.  Still, we think about it.

In “Men Explain Things to Me,” the author Rebecca Solnit notes “that we don’t know what will happen next, and the unlikely and the unimaginable transpire quite regularly.”  She finds uncertainty hopeful, while viewing despair as “a form of certainty,” certainty that “the future will be a lot like the present or will decline from it.”

Let’s cast certainty aside and agree, with Solnit, that uncertainty is hopeful.  Let’s go on to question what might happen in the uncertain future.

For example:

We wonder whether the midterm elections will change anything.

We wonder whether our kids will choose to follow our career choices or do something totally different.

We wonder whether our family history of a deadly disease will lead to having it ourselves.

We wonder whether to plan a trip to Peru.

We wonder whether we’re saving enough money for retirement.

We wonder how the U.S. Supreme Court will rule in an upcoming case.

We wonder what our hair will look like ten years from now.

We wonder what the weather will be like next week.

And we wonder what the current occupant of the White House will say or do regarding just about anything.

 

You may have an answer in mind, one that’s based on reason or knowledge or probability.   But if you’re uncertain…in almost every case, the best response is:  Who the heck knows?

If you’re stating this response to others, I suggest using “heck” instead of a word that might offend anyone.  It also lends a less serious tone to all of the unknowns out there, some of which are undoubtedly scary.

If you prefer to use a more serious tone, you can of course phrase things differently.

But I think I’ll stick with “Who the heck knows?”

Warning:  If you spend any time with me, you’ll probably hear me say it, again and again.

But then, who the heck knows?

Pockets!

Women’s clothes should all have pockets. 

(A bit later in this post, I’ll explain why.)

I admit it.  I’m a pocket-freak.

When I shop for new pants, I don’t bother buying a pair, no matter how appealing, if it doesn’t have pockets.  Why?

Because whenever I bought pants without pockets in the past, I discovered over time that I never wore them. They languished forever in a shameful pile of unworn clothes.

It became clear that I liked the benefits of wearing pants with pockets.  Why then would I buy new pants without pockets when those I already had were languishing unworn?

Result:  I simply don’t buy no-pocket pants anymore.

Most jeans have pockets, often multiple pockets, and I like wearing them for that reason, among others.  (Please see “They’re My Blue Jeans, and I’ll Wear Them If I Want To,” published in this blog in May 2017.)

Most jackets, but not all, have pockets.  Why not?  They all need pockets.  How useful is a jacket if it doesn’t have even one pocket to stash your stuff?

Dresses and skirts should also have pockets.  Maybe an occasional event, like a fancy gala, seems to require a form-fitting dress that doesn’t have pockets.  But how many women actually go to galas like that?  Looking back over my lifetime of clothes-wearing, I can think of very few occasions when I had to wear a no-pocket dress.  As for skirts, I lump them in the same category as pants.  Unless you feel compelled for some bizarre reason to wear a skin-tight pencil skirt, what good is a skirt without pockets?

Cardigan sweaters, like jackets, should also have pockets.  So should robes.  Pajamas. Even nightgowns.  I wear nightgowns, and I relish being able to stick something like a facial tissue into the pocket of my nightgown!   You never know when you’re going to sneeze, right?

Did you ever watch a TV program called “Project Runway”?  It features largely unknown fashion designers competing for approval from judges, primarily high-profile insiders in the fashion industry.  Here’s what I’ve noticed when I’ve watched an occasional episode:  Whenever a competing designer puts pockets in her or his designs, the judges enthusiastically applaud that design.  They clearly recognize the value of pockets and the desire by women to wear clothes that include them.

(By the way, fake pockets are an abomination.  Why do designers think it’s a good idea to put a fake pocket on their designs?  Sewing what looks like a pocket but isn’t a real pocket adds insult to injury.  Either put a real pocket there, or forget the whole thing.  Fake pockets?  Boo!)

Despite the longing for pockets by women like me, it can be challenging to find women’s clothes with pockets.  Why?

Several women writers have speculated about this challenge, generally railing against sexist attitudes that have led to no-pocket clothing for women.

Those who’ve traced the evolution of pockets throughout history discovered that neither men nor women wore clothing with pockets until the 17th century.  Pockets in menswear began appearing in the late 1600s.  But women?  To carry anything, they were forced to wrap a sack with a string worn around their waists and tuck the sack under their petticoats.

These sacks eventually evolved into small purses called reticules that women would carry in their hands.  But reticules were so small that they limited what women could carry.  As the twentieth century loomed, women rebelled.  According to London’s Victoria and Albert Museum, dress patterns started to include instructions for sewing pockets into skirts.  And when women began wearing pants, they would finally have pockets.

But things soon switched back to no-pocket pants.  The fashion industry wasn’t a big fan of pockets, insisting on featuring “slimming” designs for women, while men’s clothes still had scads of pockets.  The result has been the rise of bigger and bigger handbags (interestingly, handbags are often called “pocketbooks” on the East Coast).

Enormous handbags create a tremendous burden for women.  Their size and weight can literally weigh a woman down, impeding her ability to move through her busy life the way men can.  (I’ve eschewed bulky handbags, often wearing a backpack instead.  Unfortunately, backpacks are not always appropriate in a particular setting.)

Today, many women are demanding pockets.  Some have advocated pockets with the specific goal of enabling women to carry their iPhones or other cell phones that way.  I’m a pocket-freak, but some recent research suggests that the radiation cell phones emit may pose a risk to your health.  Some experts in the field have therefore advised against keeping a cell phone adjacent to your body.  In December 2017, the California Department of Public Health specifically warned against keeping a cell phone in your pocket.  So, in my view, advocating pockets for that reason is not a good idea.

We need pockets in our clothes for a much more important and fundamental reason:  Freedom.

Pockets give women the kind of freedom men have:  The freedom to carry possessions close to their bodies, allowing them to reach for essentials like keys without fumbling through a clumsy handbag.

I propose a boycott on no-pocket clothes.  If enough women boycott no-pocket pants, for example, designers and manufacturers will have to pay attention.  Their new clothing lines will undoubtedly include more pockets.

I hereby pledge not to purchase any clothes without pockets.

Will you join me?

 

 

Of Mice and Chocolate (with apologies to John Steinbeck)

Have you ever struggled with your weight?  If you have, here’s another question:  How’s your sense of smell?

Get ready for some startling news.  A study by researchers at UC Berkeley recently found that one’s sense of smell can influence an important decision by the brain:  whether to burn fat or to store it.

In other words, just smelling food could cause you to gain weight.

But hold on.  The researchers didn’t study humans.  They studied mice.

The researchers, Andrew Dillin and Celine Riera, studied three groups of mice.  They categorized the mice as “normal” mice, “super-smellers,” and those without any sense of smell.  Dillin and Riera found a direct correlation between the ability to smell and how much weight the mice gained from a high-fat diet.

Each mouse ate the same amount of food, but the super-smellers gained the most weight.

The normal mice gained some weight, too.  But the mice who couldn’t smell anything gained very little.

The study, published in the journal Cell Metabolism in July 2017, was reported in the San Francisco Chronicle.  It concluded that outside influences, like smell, can affect the brain’s functions that relate to appetite and metabolism.

According to the researchers, extrapolating their results to humans is possible.  People who are obese could have their sense of smell wiped out or temporarily reduced to help them control cravings and burn calories and fat faster.  But Dillin and Riera warned about risks.

People who lose their sense of smell “can get depressed” because they lose the pleasure of eating, Riera said.  Even the mice who lost their sense of smell had a stress response that could lead to a heart attack.  So eliminating a human’s sense of smell would be a radical step, said Dillin.  But for those who are considering surgery to deal with obesity, it might be an option.

Here comes another mighty mouse study to save the day.  Maybe it offers an even better way to deal with being overweight.

This study, published in the journal Cell Reports in September 2017, also focused on creating more effective treatments for obesity and diabetes.  A team of researchers at the Washington University School of Medicine in St. Louis found a way to convert bad white fat into good beige fat—in mice.

Researcher Irfan J. Lodhi noted that by targeting a protein in white fat, we can convert bad fat into a type of fat (beige fat) that fights obesity.  Beige fat (yes, beige fat) was discovered in adult humans in 2015.  It functions more like brown fat, which burns calories, and can therefore protect against obesity.

When Lodhi’s team blocked a protein called PexRAP, the mice were able to convert white fat into beige fat.  If this protein could be blocked safely in white fat cells in humans, people might have an easier time losing weight.

Just when we learned about these new efforts to fight obesity, the high-fat world came out with some news of its own.  A Swiss chocolate manufacturer, Barry Callebaut, unveiled a new kind of chocolate it calls “ruby chocolate.”  The company said its new product offers “a totally new taste experience…a tension between berry-fruitiness and luscious smoothness.”

The “ruby bean,” grown in countries like Ecuador, Brazil, and Ivory Coast, apparently comes from the same species of cacao plant found in other chocolates.  But the Swiss company claims that ruby chocolate has a special mix of compounds that lend it a distinctive pink hue and fruity taste.

A company officer told The New York Times that “hedonistic indulgence” is a consumer need and that ruby chocolate addresses that need, more than any other kind of chocolate, because it’s so flavorful and exciting.

So let’s sum up:  Medical researchers are exploring whether the scent of chocolate or any other high-fat food might cause weight-gain (at least for those of us who are “super-smellers”), and whether high-fat food like chocolate could possibly lead to white fat cells “going beige.”

In light of these efforts by medical researchers, shouldn’t we ask ourselves this question:  Do we really need another kind of chocolate?

Feeling Lazy? Blame Evolution

I’m kind of lazy.  I admit it. I like to walk, ride a bike, and splash around in a pool, but I don’t indulge in a lot of exercise beyond that.

Now a Harvard professor named Daniel Lieberman says I can blame human evolution.  In a recent paper, “Is Exercise Really Medicine? An Evolutionary Perspective,” he explains his ideas.

First, he says (and this is the sentence I really like), “It is natural and normal to be physically lazy.”  Why?  Because human evolution has led us to exercise only as much as we must to survive.

We all know that our ancestors lived as hunter-gatherers and that food was often scarce.  Lieberman adds this idea:  Resting was key to conserving energy for survival and reproduction.  “In other words, humans were born to run—but as little as possible.”

As he points out, “No hunter-gatherer goes out for a jog, just for the sake of it….”  Thus, we evolved “to require stimuli from physical activity.”  For example, muscles become bigger and more powerful with use, and they atrophy when they’re not used.  In the human circulatory system, “vigorous activity stimulates expansion of …circulation,” improves the heart’s ability to pump blood, and increases the elasticity of arteries.  But with less exercise, arteries stiffen, the heart pumps less blood, and metabolism slows.

Lieberman emphasizes that this entire process evolved to conserve energy whenever possible.  Muscles use a lot of calories, making them costly to maintain.  Muscle wasting thus evolved as a way to lower energy consumption when physical activity wasn’t required.

What about now?  Until recently, it was never possible in human history to lead an existence devoid of activity.  The result:  According to Lieberman, the mechanisms humans have always used to reduce energy expenditures in the absence of physical activity now manifest as diseases.

So maladies like heart disease, diabetes, and osteoporosis are now the consequences of adaptations that evolved to trim energy demand, and modern medicine is now stuck with treating the symptoms.

In the past, hunter-gatherers had to exercise because if they didn’t, they had nothing to eat.  Securing food was an enormous incentive.  But today, for most humans there are very few incentives to exercise.

How do we change that?  Although there’s “no silver bullet,” Lieberman thinks we can try to make activity “more fun for more people.”  Maybe making exercise more “social” would help.  Community sports like soccer teams and fun-runs might encourage more people to get active.

Lieberman has another suggestion.  At his own university, students are no longer required to take physical education as part of the curriculum.  Harvard voted its physical-education requirement out of existence in the 1970s, and he thinks it’s time to reinstate it.  He notes surveys that show that very few students who are not athletes on a team get sufficient exercise.  A quarter of Harvard undergraduates have reported being sedentary.

Because “study after study shows that…people who get more physical activity have better concentration, their memories are better, they focus better,” Lieberman argues that the time spent exercising is “returned in spades…not only in the short term, but also in the long term.  Shouldn’t we care about the long-term mental and physical health of our students?”

Lieberman makes a powerful argument for reinstating phys-ed in those colleges and universities that have dropped it.  His argument also makes sense for those of us no longer in school.

Let’s foil what the millennia of evolution have done to our bodies and boost our own level of exercise as much as we can.

Tennis, anyone?

 

[Daniel Lieberman’s paper was the focus of an article in the September-October 2016 issue of Harvard Magazine.  He’s the Lerner professor of biological sciences at Harvard.]

 

Take a hike

The lure of “the gym” has always escaped me. I’ve joined a few fitness centers in my day, but I consistently end up abandoning the gym and resorting to my preferred route to fitness: walking. Whenever possible, I walk and hike in the great outdoors.

A host of recent studies has validated my faith in the benefits of walking. And some of these benefits may surprise you.

First, being active is better for your health. Duh. We’ve all suspected that for a long time. But here’s a new finding: sitting may be the real problem. Studies show that the more you sit, the greater your risk for health problems. In a study of more than two thousand adults ages 60 and older, every additional hour a day spent sitting was linked to a 50 percent greater risk of disability. Even those who got some exercise but sat too much were more likely to end up disabled.

Dorothy Dunlop and her colleagues at Northwestern University concluded that sitting seems to be a separate risk factor. Getting enough exercise is important, but it’s equally important not to be a couch potato the rest of the time. Their study appeared in the Journal of Physical Activity & Health in 2014.

Another study, published in Medicine & Science in Sports & Exercise, noted something else about prolonged sitting: taking “short walking breaks” at least once an hour may lessen or even prevent some of the adverse effects, especially on the cardiovascular system. When healthy young men sat for 3 hours without moving their legs, endothelial function—the ability of blood vessels to expand and contract—dropped significantly from the very beginning. But when they broke up their sitting time with slow 5-minute walks every 30 or 60 minutes, endothelial function did not decline.

Here’s another benefit: Exercise, including walking, can keep you from feeling depressed. A British study, reported in JAMA Psychiatry, followed over 11,000 people (initially in their early 20s) for more than 25 years. It found that the more physically active they were, the less likely they were to have symptoms of depression. For example, sedentary people who started exercising 3 times a week reduced their risk of depression 5 years later by almost 20 percent. The researchers concluded that being active “can prevent and alleviate depressive symptoms in adulthood.”

Ready for one more reason to walk? A study described in The Wall Street Journal in 2014 found that walking can significantly increase creativity. This is a brand new finding. In the past, studies have shown that after exercise, people usually perform better on tests of memory and the ability to make decisions and organize thoughts. Exercise has also been linked anecdotally to creativity: writers and artists have said for centuries that their best ideas have come during a walk. But now science supports that link.

Researchers at Stanford University, led by Dr. Marily Oppezzo, decided to test the notion that walking can inspire creativity. They gathered a group of students in a deliberately unadorned room equipped with nothing more than a desk and a treadmill. The students were asked to sit and complete “tests of creativity,” like quickly coming up with alternative uses for common objects, e.g., a button. Facing a blank wall, the students then walked on the treadmill at an easy pace, repeating the creativity tests as they walked. Result: creativity increased when the students walked. Most came up with about 60 percent more “novel and appropriate” uses for the objects.

Dr. Oppezzo then tested whether these effects lingered. The students repeated the test when they sat down after their walk on the treadmill. Again, walking markedly improved their ability to generate creative ideas, even when they had stopped walking. They continued to produce more and better ideas than they had before their walk.

When Dr. Oppezzo moved the experiment outdoors, the findings surprised her. The students who walked outside did come up with more creative ideas than when they sat, either inside or outside, but walking outside did not lead to more creativity than walking inside on the treadmill. She concluded that “it’s the walking that matters.”

So a brief stroll apparently leads to greater creativity. But the reasons for it are unclear. According to Dr. Oppezzo, “It may be that walking improves mood,” and creativity blooms more easily when one is happier. The study appeared in The Journal of Experimental Psychology: Learning, Memory, and Cognition in 2014.

In truth, I don’t need these studies to convince me to keep walking. It helps that I live in San Francisco, where the climate allows me to walk outside almost every day. Walking is much more challenging when you confront the snow and ice that used to accompany my walks in and around Chicago. So I’m not surprised that walkers in colder climes often resort to exercising indoors.

It also helps that San Francisco has recently been voted the second most walkable city in America. According to Walk Score, an organization that ranks the “walkability” of 2,500 cities in the U.S., SF placed second only to New York City among major American cities.

SF’s high score is especially impressive in light of the city’s hills. Although I avoid the steepest routes, I actually welcome a slight incline because it adds to my aerobic workout. Why use a Stairmaster in a gloomy gym when I can climb uphill enveloped in sunshine and cool ocean breezes?

But whether you walk indoors or out, do remember to walk! You’ll assuredly benefit health-wise. And you just may enhance your creativity quotient. Someday you may even find yourself writing a blog like this one.