Hangin’ with Judge Hoffman

POST #7

This is the seventh in a series of posts that recall what it was like to clerk for Judge Julius J. Hoffman from 1967 to 1969.

The “Chicago 7” Trial

            In the spring of 1969, shortly before I was to leave my clerkship, a new case arrived in Hoffman’s chambers.  It resulted from a grand jury’s investigation into the events that had transpired in Chicago the previous summer, just before and during the Democratic National Convention in August 1968.  Eight men, later known as the “Chicago 8,” were accused of violating a new federal law, the Anti-Riot Act, by inciting demonstrations and violent encounters with the police in the streets and parks of Chicago.

            Countless books and articles have been written about this trial, sometimes called the Chicago “conspiracy trial,” and I’ve also seen a number of dramatic presentations on the stage, TV, and film. I don’t question the validity of any of these and won’t comment on them here.

            In 2020, a new film, written and directed by Aaron Sorkin, “The Trial of the Chicago 7,” appeared on our screens.  I’ll comment briefly on that film later.

            Because I clerked for Judge Julius J. Hoffman from 1967 to 1969, and because I lived in Chicago during that tumultuous time, I want to offer my personal comments on the trial and the events that led up to it.

            In this post, I'll state my point of view as someone in a unique situation: I worked with Judge Hoffman at the very outset of the case and, after the trial was over, briefly talked to him about it.

First, some background 

I began my clerkship with Judge Hoffman during the summer of 1967, and I served as his junior clerk until the summer of 1968, when I became his senior clerk for a year.  My two-year clerkship ended during the summer of 1969.  Throughout my clerkship, I was living in my hometown of Chicago.  To help you comprehend the significance of the trial and surrounding events, I want to put them into some sort of context.

My remarks are based on my personal recollections of Chicago during those years.

            First, I was well aware of the rampant corruption in the city.  At the time, I sometimes described the city’s government as a “benevolent dictatorship.” Looking back, I no longer view it as “benevolent,” but it certainly was a dictatorship.

            The city's government was dominated by one man: Richard J. Daley, who served as the city's mayor from 1955 until his death in 1976.  He not only held the executive leadership role, but he also made sure that local ordinances and everything else he wanted were enacted by a complicit city council filled with his acolytes (I recall only one dissenter during those years, an alderman named Leon Despres).  In addition, he decided who would fill local judicial openings.  As a result, the state courts were rife with incompetent judges loyal to Daley and his machine.

            Federal judgeships were somewhat different.  Many if not most of the judges were more or less independent of Daley, at least once they were on the bench.  Even though these judges were probably not entirely free of corrupt motives of their own, most of them (including Democratic appointees with some past connection to Daley) were not necessarily loyal to the Daley regime.

            Judge Hoffman was a Republican appointed by Dwight Eisenhower and to my knowledge not connected in any way to the Daley machine, although I suspect that many members of the public thought he was. At a luncheon in Chicago in 2001, I was seated next to a prominent news anchor who voiced his assumption that Hoffman was part of the Daley machine.  I immediately corrected him.

Political developments during 1968

            By early 1968, many Democrats had grown restless with the presidency of LBJ.  National politics heated up when a number of men announced that they hoped to replace him.

            The heat became intense on March 31, when LBJ announced in a live TV appearance that he would not run again. (I recall watching him make that stunning announcement.) 

            LBJ reportedly dropped out because Senator Eugene McCarthy of Minnesota had won a startling 42 percent of the vote in the New Hampshire primary, while LBJ, the incumbent, won only 49 percent.  McCarthy was the leading voice advocating an end to the Vietnam War, a position that was becoming increasingly popular.  McCarthy's primary showing led Senator Robert F. Kennedy of New York to enter the race a short time later.  Like McCarthy, he ran on an antiwar platform and advocated a number of other popular positions.

Even before LBJ's announcement on March 31, I, like so many others, had become disillusioned with him, despite all of his remarkable accomplishments in the domestic realm, because of his increasing and unwavering support of the Vietnam War.  I never considered supporting Republican Richard Nixon, who had run against and lost to JFK in 1960 (and later lost his race for governor of California).  I loathed Nixon for a great many reasons.  But for a while, New York's relatively moderate Republican governor, Nelson Rockefeller, seemed like a possible alternative to LBJ.

During a trip to New York City to visit friends in early March, I accompanied one of them to a meeting of the NYC bar association, where Rockefeller was the main speaker.  As he finished his speech and left for the exit, I ran after him.  Just outside the building, I approached him and stuck out my hand to shake his.  "Please run for president," I urged him as we shook hands.  He smiled and said, "Well, aren't you dear?" before descending the steps to his waiting car.

I often wondered how things might have turned out if Rockefeller had taken the advice I offered him on those steps before Nixon's grip on his party became too strong to overcome.  A short time later, LBJ dropped out of the running, and the Democratic race was wide open.  Rockefeller no longer held any real appeal for me.  Now, with LBJ out of the picture, I would decide which of the Democratic candidates remaining in the race I'd support.

The Democratic National Convention, which would choose the ultimate nominee, was scheduled to be held in Chicago in late August of 1968.

My life in Chicago

I tried to keep up with political developments that spring and summer, but truthfully, I was primarily focused on the things that dominated my everyday life.  First, there were my responsibilities as Hoffman’s law clerk.  Then there was travel, like my trip to NYC in early March.  I also spent time with old friends I’d known for years along with some new ones.  And then there was my attempt to meet potential suitors.  (I was focused on my career but not to the exclusion of marriage and kids.)

In April, Chicago was rocked by the murder of Martin Luther King Jr. in Memphis.  Fires and looting broke out in parts of the city.  I recall visiting the apartment of someone I was dating at the time.  Together we stood at the windows of his high-rise apartment in Sandburg Village viewing the widespread fires we could see below.  I was immensely saddened by King’s death and the terrible destruction that followed.  But my own everyday life didn’t really change.

Was I aware of the efforts by antiwar activists who were gearing up for the DNC in August?  Just barely.  I often watched local TV news and read the daily Chicago Sun-Times, so I was vaguely aware that there was a Yippie movement headed by Abbie Hoffman and Jerry Rubin.  Didn’t they publicize a stunt where they brought a pig to Chicago, announcing that it was a candidate for president?  Because that struck me as pretty ridiculous, it was hard to take them seriously.  

I probably had read something about Tom Hayden and the SDS, but I honestly knew very little about them.  I was also vaguely aware of the Black Panthers, led in Chicago by Fred Hampton, but their agenda didn’t have any noticeable impact on my everyday life.

I did follow the campaigns of the leading Democratic hopefuls.  Eugene McCarthy was my early favorite, but by early June I was considering shifting my allegiance to RFK.  I was therefore horrified, along with the rest of the country, when he was assassinated in the Ambassador Hotel in LA that June.  When I briefly lived in LA at the age of 12, my family's first home was a rented apartment on Normandie just off Wilshire Boulevard, very close to the Ambassador, and I had strolled near there.  That memory made RFK's assassination even more real to me than it might otherwise have been.

After he died, I returned to supporting Eugene McCarthy, but I ultimately resolved (with reservations) to support LBJ's vice president, former Minnesota Senator Hubert Humphrey, as the most electable of the Democratic candidates for president.  Humphrey had a long and admirable record as a liberal Democrat who had supported civil rights legislation and other liberal causes for many years.  But although he earned the support of liberal senators like Fred Harris of Oklahoma and Walter Mondale of Minnesota, he faced vehement opposition because of his adherence to LBJ's Vietnam policies.  He spoke out against the call by Senator McCarthy and Senator George McGovern for an immediate end to the bombing in Vietnam, an early withdrawal of troops, and talks aimed at forming a coalition government with the Viet Cong.

Humphrey didn’t enter any of the primary elections held in 13 states, but he won the party nomination at its chaotic convention in Chicago in August.  He lost the November election by less than one percent of the popular vote, but he carried only 13 states. 

The Democratic convention and the events surrounding it led to the trial of the “Chicago 7.”

My vacation that summer

I planned to take my summer vacation during the convention for a simple reason: A close friend who lived in NYC asked me to join her on a road trip that would leave Chicago just as the convention was beginning. I was living paycheck-to-paycheck (my salary was $6,000 for the year), so I jumped at the chance to get an essentially free ride to NYC. 

My friend was coming to Chicago for a wedding, and we would leave on our road trip to NYC on Sunday, August 25, just as the convention was about to begin.  I planned to see friends in NYC and Boston, and then travel from Boston to Cape Cod with another close friend.  Because I was busy making my vacation plans, I was largely insulated from news about the convention.

In Hoffman's chambers, things at this point seemed routine.  I was largely preoccupied with drafting an opinion in the case of The Inmates of Cook County Jail, which I've discussed in Post #4.  As I mentioned in that post, I left my semi-radical opinion on Hoffman's desk on Friday afternoon the 23rd for him to read while I was away.  (I was later amazed on my return to Chicago to learn that he'd read this opinion from the bench during my vacation.)

I was living that summer at 1360 Lake Shore Drive (where the rent on my studio apartment was a whopping $140 a month).  My mother lived a few miles farther north at Lake Shore Drive and Aldine.  Leaving my mother’s apartment the Saturday night before I was to take off on my vacation, I rode on a bus that drove through Lincoln Park (the largest park on the North Side of the city) en route to my apartment.  I was startled to see masses of people gathering in the park shortly before the convention was to begin.  During the work week, I’d been similarly shocked to see U.S. Army jeeps driving up and down city streets in downtown Chicago when I walked to work in the Federal Building on Dearborn Street. 

Both unprecedented sights made me wonder exactly what might happen in the city during the convention.  But they weren't front and center in my mind.  I remember feeling rather glad to be leaving town, avoiding the ominous events that promised to unfold in Chicago while I was away.

Did I follow the news during my road-trip vacation?  Not really.  So I was pretty much unaware of what was happening at the convention.  But once I arrived in NYC, I stayed with a friend in Greenwich Village and, on the night that became notorious, I watched the convention with her and her husband on their living-room TV.  Needless to say, I was shocked by what I saw.  And terribly embarrassed by the behavior of Chicago's Mayor Daley, revealed for all to see on TV.  I remember watching Senator Abe Ribicoff speaking at the podium, nominating George McGovern for president, and defying Daley, whose henchmen booed the U.S. senator from Connecticut.  Daley was caught on camera mouthing expletives about Ribicoff that TV wouldn't or couldn't repeat.

At the same time, TV news coverage highlighted what was happening elsewhere in Chicago.  The convention was held at the International Amphitheater, a considerable distance from the center of the city.  But a multitude of antiwar protesters had gathered in the very large downtown park, Grant Park, located adjacent to Michigan Avenue.  These protesters created havoc as they began to move onto Michigan Avenue, and the violent reaction by the Chicago police department, seen across the world on TV, was later described as a "police riot."

In NYC, I put that chaotic vision aside and went on to meet another friend in Boston.  We traveled together to Cape Cod, where we heard that Humphrey had chosen Maine Senator Edmund Muskie as his VP.  Muskie seemed like a good choice, and despite the turmoil in the Democratic Party, I was hopeful that the Humphrey-Muskie ticket could defeat Tricky Dick.

Flying back to Chicago to resume my life there, I discovered that things had largely settled down.  What had happened during the convention didn’t loom large in my mind as I began to pay close attention to the much more compelling 1968 election campaign.  The outcome would steer our country down Humphrey’s path or down a very different one.

In November, I was plunged into gloom by the dispiriting Nixon victory.  I remember watching election-night news coverage in agony as Tricky Dick’s votes added up.  I saw his victory unfold on my tiny black-and-white TV, seated on my sofa next to a date who’d asked me to accompany him earlier that evening to a performance of “Jacques Brel Is Alive and Well and Living in Paris” at The Happy Medium.  Maybe because I associated the guy with that terrible night, I was OK when he gradually faded from my life.  I honestly didn’t care if I never saw him again.

Nixon’s victory changed everything. 

To be continued….

RBG in ’72

Countless words have been, and will continue to be, written about the incomparable U.S. Supreme Court Justice Ruth Bader Ginsburg, who served on the high court for 27 years.

I will leave discussions of her tenure on the Court to others.

What I will do here is recount the one and only time I encountered her in person, at a law school conference, at a pivotal point in her career.  If you’re interested in learning about that encounter, please read on.

In September of 1972, I was a full-time faculty member at the University of Michigan (UM) Law School.  Notably, I was the only full-time faculty member who was a woman.

The law school had a desirable setting on the UM campus, whose multitude of elm trees were unfortunately denuded of leaves, thanks to Dutch elm disease. The law school buildings made up the stunning Law Quadrangle, featuring beautiful old buildings constructed in the English Gothic style.

My role on the faculty was to help first-year law students learn the basics of legal education:  how to analyze court rulings (the kind they would read in the books assigned to them in courses like Torts and Contracts); how to do their own research into case law; and how to write a readable legal document, like an appellate brief aimed at persuading an appellate court to decide in their favor.

I was one of four young lawyers hired to fill this role.  The three men and I each taught one-fourth of the first-year class.  As I recall, we got to choose our offices in the law school library, and I immediately chose a plum.  It was an enormous wood-paneled room with charming hand-blown stained glass windows.  One entered it via a stairway leading upstairs from the library’s impressive reading room.  I treasured my office and happily welcomed meeting with students there.  And I wonder, in light of renovations at the law school, whether that glorious office still exists.

At some point early that fall, I learned that a conference on “women and the law” would be held at the New York University School of Law in October.  This was a bold new area of law that most law schools didn’t consider worth their attention.  NYU was clearly an exception. 

The idea of the conference immediately grabbed my attention because of my longstanding interest in its stated focus.  Beginning very early in my life, I was (and remain) concerned with achieving equity and justice, including equal rights for women.

This focus had led me to attend law school during the mid-’60s.  My first job was that of law clerk to a U.S. district judge in Chicago.  After finishing my clerkship, I became a practicing lawyer as a Reggie assigned to my first choice, the Appellate and Test Case Division of the Chicago Legal Aid Bureau.  [I discussed the Reggie program in a blog post, “The Summer of ’69,” published on August 7, 2015.]

And so, three years earlier, in October of 1969, I had begun working on a lawsuit that had a significant bearing on women’s rights because it would challenge the constitutionality of Illinois’s restrictive abortion law. This law had an enormous impact on the lives of women, especially poor and non-white women.

I worked with Sybille Fritzsche, a lawyer with the ACLU in Chicago, who became my close friend.  Sybille and I spent months preparing our case.  We filed our lawsuit in February 1970, argued it before a three-judge federal court in September, and won a 2-to-1 ruling in our favor in January 1971.  (The ruling in that case, Doe v. Scott, and the events leading up to it, are the focus of a book I’m currently writing.  In the meantime, you can read about our case in historian Leslie Reagan’s prize-winning book, When Abortion Was a Crime.)

Now, in the fall of 1972, I learned about the conference at NYU.  Because I was extremely interested in attending it, I decided to ask the UM law school’s dean, Theodore St. Antoine, whether the school might send me to New York to attend it.  I thought I had a pretty persuasive argument:  I was the only full-time woman on the law school faculty.  Didn’t the dean think it would be a good idea to send me to represent UM at the conference? 

How could he say "no"?  Ted thought about it for a moment, then gave his approval.  So off I went, my expenses paid by the kind patrons of UM.

My hotel, the Fifth Avenue Hotel, located near NYU’s law school, had sounded appealing on paper, but it turned out to be something of a dump.  It suited me just fine, however, because I barely spent any time there.  I was too busy attending the conference sessions and, when I could, taking a short break to reconnect with a couple of law-school classmates and briefly sample life in New York City, a city light-years removed from less-than-exhilarating Ann Arbor, Michigan.

The conference, held on October 20-21, turned out to be a symposium sponsored by the AALS (the Association of American Law Schools), "The AALS Symposium on the Law School Curriculum and the Legal Rights of Women."  It featured a number of prominent speakers, mostly law professors and practicing lawyers who had turned their attention to "the legal rights of women" in areas like tax law, property law, and criminal law.  I attended most of these sessions, and each of them was excellent.

But the only session I was really excited about was a talk by someone named Ruth Bader Ginsburg.  I was quite certain that I would relish hearing her talk, “Toward Elimination of Sex-Based Discrimination: Constitutional Aspects,” because the topic was right down my alley.

Looking back, I don't think I knew anything about RBG at the time.  But when she was introduced (by NYU dean Robert McKay) and began to speak, I was riveted by every word she uttered.  She spelled out everything she had already done and planned to do to achieve gender equity.

Although she was new to me, I knew immediately that she clearly was, and would continue to be, a brilliant leader in the field of women's rights.  I filed her name away in my memory so I could follow whatever she would do in the coming years.  And I did just that, enthusiastically following the many astounding accomplishments she achieved after 1972.

Your image of RBG may be that of the frail, petite woman who took center stage in our culture in her 80s.  But the RBG I saw in 1972 was very different.  She was an amazingly attractive young woman of 39.  You can see photos of her at that time in The New York Times of September 18 (in Linda Greenhouse’s long review of her life and career) and in a recent issue of TIME magazine (Oct. 5-12, 2020). Although much has been made of her short stature (one I share), she was so very energetic and focused that one quickly forgot how small she was.

It turned out that she had attended Harvard Law School about a decade before I did.  Like her, I’ve been called a “trailblazer” and a “pioneer,” and I also confronted gender-bias at every turn throughout my life.  My path was only a bit less rocky than hers:  My class at HLS included the whopping number of 25 women in a class of 520, while hers had only 9.

I’ve since learned that October 1972 marked a pivotal time in RBG’s career.  She had just switched her teaching position from Rutgers Law School to Columbia Law School (a considerable upgrade).  And she had just assumed another new position:  Director of the Women’s Rights Project at the ACLU, a project she had helped to found a short time before. 

So I’m left wondering…did she know about the case Sybille (an ACLU attorney in Chicago) and I brought in February 1970, a case that put a woman’s right to reproductive choice front and center?

RBG was an ardent supporter of reproductive rights during her tenure on the Supreme Court.  She discussed her views on abortion and gender equality in a 2009 New York Times interview, where she said “[t]he basic thing is that the government has no business making that choice for a woman.”

But I know that she had also stated that she wasn't entirely happy with the way every woman in the U.S. won that choice: by bringing cases like Doe v. Scott in the federal courts, culminating in Roe v. Wade.  She said she would have preferred that the argument be made, over time, in each state's legislature, with the right to choose adopted gradually in that way rather than established in one overriding court ruling that covered every state.

Notably, on the 40th anniversary of the court’s ruling in Roe v. Wade, she criticized the decision because it terminated “a nascent democratic movement to liberalize abortion laws” that might have built “a more durable consensus” in support of abortion rights.

She had a point.  A democratic movement to liberalize abortion laws would have been the ideal route, and might have been a less contentious route, to achieving abortion rights throughout the country. 

But I think her position was influenced by her own life story. 

It stemmed, at least in part, from the fact that in April 1970 she was living and working in New York, where the state legislature had passed a new law allowing abortion, signed by Governor Nelson Rockefeller on April 11, 1970.  New York became only the second state in the U.S. (after Hawaii) to permit abortion, and only a few other states had carved out any sort of exception to what was otherwise a nationwide ban.

RBG may have optimistically believed that other states would follow New York’s lead.  But history has proved otherwise.

If women had waited for each of the 50 states to accomplish the goal of women's reproductive choice, I think we'd still have many states refusing to enact laws allowing choice.  In support of my view, I ask readers to consider the situation today: some states are trying so frenetically to restrict abortion, with or without achieving a complete ban, that they now seem to be simply waiting for a far-right conservative Court to overturn Roe v. Wade.

Whether or not RBG was aware of what was happening in the courtrooms of Chicago in 1970, I think I could have persuaded her that Sybille and I were doing the right thing.  

By advocating that the federal district court hold that the restrictive Illinois abortion law was unconstitutional, and persuading the court to decide in our favor, we achieved our goal of saving the lives and health of countless women who would have otherwise suffered from their inability to obtain a legal and medically safe abortion.

What greater achievement on behalf of women’s rights could there have been? 

I like to think that, after hearing my argument, RBG would have approved.

Join the ranks of the scarf-wearers

I’ve been wearing scarves all my life.  In a dusty photo album filled with black-and-white snapshots, there I am at age 8, all dressed up in my winter best, going somewhere on a cold Thanksgiving Day wearing a silk scarf that wasn’t nearly warm enough.  (Please see “Coal: A Personal History,” published in this blog on January 24, 2020.)

My mother probably set the tone for my sister and me.  We adopted the fashionable wearing of head scarves, following such notables as Queen Elizabeth II (who wears her Liberty silk scarves to this day, especially during her jaunts in chilly Scotland) and the very stylish Audrey Hepburn. (Please see "Audrey Hepburn and Me," published in this blog on August 14, 2013.)

The result:  A vast collection of scarves of every description, from humble cotton squares that look like a tablecloth in an Italian restaurant (note: these were made in France!), to lovely hand-painted silk in charming pastel colors, to Hermès lookalikes purchased from vendors in New York City’s Chinatown before the authorities cracked down on illicit counterfeit-selling.

And I wear them.  Especially since I moved to breezy San Francisco, where I never leave my home without a light jacket (or cardigan sweater), a scarf in a handy pocket (and women’s clothes should all have pockets; please see “Pockets!”, published in this blog on January 25, 2018), and a sunhat to protect my skin from the California sun (even when it’s hiding behind a cloud or two).  The only exceptions:  When there’s a torrential downpour or when we’re having unusually hot weather and only the sunhat is a must.

Now I learn that my huge array of scarves may, if used properly, protect me and others from the current scourge of COVID-19.  The State of California Department of Public Health has issued guidelines stating that wearing face coverings, including scarves, may help prevent the spread of the coronavirus.  The CDC and Bay Area public health officials have given similar advice.

Following this guidance, I began wearing scarves as face coverings several days ago, and I can now pick and choose among those I like best, so long as they are substantial enough to do the job.

Of course, I don’t want to scare anyone. After all, a black scarf worn on one’s face can be intimidating.  I certainly don’t want to enter a corner grocery store looking like a miscreant about to pull a hold-up.  So I’m opting for bright colors and cheerful designs.

We're instructed to wash our scarves in hot water after each wearing.  So silk is pretty much out.  Instead I'm inclined to wear cotton or cotton blends, large enough and foldable enough to cover my nose and mouth.

So before I take off for my daily stroll, I search for just the right scarf among a wide range of choices.  Shall I choose the black-and-white cotton checkered number?  How about the Vera design featuring bright green peas emerging from their pods on a bright white background?  Or shall I select one of the scarves I bought at the Museo del Prado in Madrid in 1993, eschewing the tempting jewelry reproductions offered in the gift shop in favor of the less expensive and far more practical scarves with an admittedly unique design? (I bought two, each in a different color combination.)

I've worn all of these already, and tomorrow I'll begin dipping into my collection to find still others.

I have to confess that I’m not particularly adept at tying my scarves as tightly as I probably should.  But whenever I encounter another pedestrian on my route (and there aren’t many), we steer clear of each other, and I use my (gloved) hand to press the scarf very close to my face.  That should do it, protection-wise.

One more thing I must remember before I wrap myself in one of my scarves:  Forget about lipstick.  Absolutely no one is going to see my lips, and any lip color would probably rub off on my scarf.  Forgeddaboutit.

Please note:  By writing about my scarf-wearing, I do not mean to trivialize the seriousness of the current crisis.  I’m simply hopeful that wearing these bright scarves–and telling you about them–will help to soften the blow the virus has already dealt so many of us.

Please join me as a scarf-wearer and, with luck, we'll all stay safe and well.  Fingers crossed!

Hooray for Hollywood! Part II: I Love Your “Funny Face”

I’m continuing to focus on films that have been relevant to my life in some way.

The film I’m focusing on today is “Funny Face,” a 1957 film starring Audrey Hepburn and Fred Astaire.

I first saw this film at Radio City Music Hall during a memorable trip to Washington DC and NYC, a trip made with my high school classmates, and one that represented the height of excitement in my life at that time.

It wasn’t my first visit to NYC and Radio City.  It also wasn’t my first trip to DC.

My parents had taken my sister and me on a road trip to the East Coast during the summer of 1950, when I was barely conscious and didn't get a great deal out of it.  I did have a few notable experiences—staying at the St. Moritz Hotel on Central Park South (how did we afford that?) and viewing some astounding sights in DC, mostly from a cab Daddy hired to show us around town. The place I remember most was an FBI museum, where I was frightened by a loud demonstration in which a gun was shot at targets to prove how the FBI dealt with crime. (Not a great choice for a young kid.)

Some other memories include our entering a DC restaurant where the tables were covered with pink “reserved” signs, and one sign was magically whisked away when we arrived.  I later learned that the restaurant used this ploy to prevent people of color from eating there.  The staff would refuse to seat them, telling them that all of the tables were reserved.  Even at a tender age, this struck me as wrong, although I was too young to fully understand the ugliness of this blatant form of discrimination, one I’d never encountered when we ate at restaurants in Chicago.

Another vivid memory:  Strolling through Central Park Zoo in NYC, I asked Daddy to buy me a balloon.  Daddy refused.  I didn’t view my request as unreasonable.  Looking around, I saw all those other kids who were holding balloons.  Why couldn’t I have one?  I was too young to grasp reality: My father was in NYC to search for a new job (which never materialized), and our family budget didn’t permit buying an overpriced balloon.  No doubt the balloon vendors catered to far more affluent families than mine.  But I remember crying my eyes out because of the balloon-deprivation, which seemed so unfair to me.

Finally, I remember viewing a film at Radio City.  It was a poor choice for a family film: “The Men,” starring Marlon Brando as an injured war veteran.  It was a somber film, and the atmosphere was not made any cheerier by the newsreel (ubiquitous in movie theaters then) featuring the brand-new war in Korea, which had just begun in June.  The Rockettes probably did their thing, but I barely noticed them, too disturbed by the sad movie and the scary newsreel.

Fast forward a bunch of years, when I joined my high school classmates on a school-sponsored trip to DC and NYC, during which our group of rowdy teenagers disrupted life for countless locals.  Standing out in my memory is a concert held at the Pan American Union Building, a beautiful Beaux-Arts building in DC, where my silly friends and I began to stare at a mole on the back of a young woman sitting in front of us.  Our adolescent sense of humor led us to start laughing, and once we started, we of course couldn’t stop.  Other concert-goers were probably horrified.  But something else I can’t forget:  The concert included a brilliant rendition of Mussorgsky’s “Night on Bald Mountain,” a piece I’ve loved ever since.

Moving on to NYC, where we were bused to an odd assortment of sites, we finally arrived at Radio City. The film that night was one of Hollywood’s new blockbusters, “Funny Face.”  Surrounded by my friends, whispering and laughing throughout, I barely focused on the film, certainly not enough to remember it very well.  But when I recently re-watched it on TCM, I found it completely delightful.  (Thanks, TCM, for all of the classic films I’ve watched on your channel.  Please keep showing them!)

In the film, which features a number of Gershwin tunes (including “Funny Face” and “S’wonderful”), Audrey Hepburn stands out as the radiant star she had become, while (in my view) Fred Astaire recedes into the background.

The movie’s storyline focuses on a NYC-based fashion magazine like Vogue, dominated by an aggressive editor played by Kay Thompson (much like the editor played by Meryl Streep years later in “The Devil Wears Prada”).  The editor (Kay) insists on major changes at the magazine and demands that her favored photographer, played by Astaire (Fred), help her effect those changes.  (His character is based on the renowned photographer Richard Avedon.)

Their search for a new look for the magazine improbably leads them to a bookstore in Greenwich Village, where Hepburn (Audrey) is the sole salesperson, the owner being off somewhere doing his own thing.  When Kay proposes that Audrey be the new face of her fashion magazine, Audrey, garbed in neutral black and gray, ridicules the whole concept of such a publication (it features, in her words, “silly women in silly dresses”).  But when Kay’s offer includes a trip to Paris, Audrey decides to go along with the idea.  She’s always wanted to see Paris!

Kay, Fred, and Audrey arrive in Paris about 15 years before my own first trip there.  But when the film begins to roam through the highlights of the city, I easily recognize the many breathtaking scenes I saw for the first time in 1972, including the view from the top of the Eiffel Tower.  (I’ve luckily returned to Paris many times, and the city and all that it offers still thrill me.)

As a teenager, I had a high regard for “fashion.”  My family’s business–women’s fashion-retailing–probably had something to do with it.  Peer pressure also played a role.  Some of my classmates were obsessed with pricey clothes, like cashmere sweaters with matching skirts, and even though I wasn’t in the same income bracket, their obsession couldn’t help rubbing off on me.  At least a little.  My place in the world just then probably accounts for my somewhat detached view of Audrey as someone who spoofs the fashion industry, at least at first.

Once the story gets underway, “Funny Face” offers a wealth of imaginative episodes.  Screenwriter Leonard Gershe, whose writing is clever and holds up surprisingly well, was Oscar-nominated for Best Story and Screenplay.  Gershe came up with a whole lot of scenes that highlight Paris.  A special scene takes place after Audrey goes off on her own, and Fred is sent out to track her down.  He finally finds her in a small café on the Left Bank, where she launches into a stunning dance set to jazz music.  (You may already know that Audrey had a background in dance.  She studied ballet as a teenager in Amsterdam and later in London.  She then began performing in West End musical theater productions and went on to star on Broadway in a non-musical production of Gigi in 1951.  She reportedly turned down the same role in the 1958 film.)

The jazz dance scene in “Funny Face” became famous a few years ago, when Gap used a portion of it in one of its TV commercials.  (As I recall, Gap was promoting the sort of black pants Audrey danced in.)  A controversy arose during the filming of this scene:  Audrey wanted to wear black socks, while director Stanley Donen insisted that she wear white ones.  In an interview given shortly before his death, Donen explained why:  the white socks would highlight her dancing feet, while black ones would fade into the background.  Donen persuaded Audrey to see things his way, and the dance scene is now film history.

Without elaborating on the plot, I’ll point out that Audrey’s storyline has an interesting focus on “empathy,” a concept that has gained a foothold in popular culture in recent years.  (I attribute some of that to Barack Obama’s focus on it, something I picked up on when I first heard him speak to a group of lawyers in Chicago in 2002, when he was still an Illinois state senator.)

Dance highlights in the film include not only Audrey’s jazz dance scene in the Left Bank café but also Fred’s dance scene with an umbrella and a coat lining that transforms into a cape.  The two leads share at least two memorable dance scenes, including the closing scene set in a charming landscape outside a Paris church.

Notably, after Audrey leaves NYC for Paris, she poses all over the City of Light in clothes designed by Givenchy, who became her favorite designer, and whose designs for this film seem timeless.  Also notably, she wears shoes with heels, but they’re invariably very low heels.  These became her favorite style of footwear.  (For some of the “inside Audrey” comments made here, please see my earlier blog post, “Audrey Hepburn and Me,” published on August 14, 2013.)

Finally, the age difference between Audrey and Fred is stark.  She was 28 while he was 58—and looked it.  Despite his agile dancing, he was an unlikely man for her to fall in love with.  But then Hollywood often paired her with much older men.  The all-time creepiest example was Gary Cooper in “Love in the Afternoon.”  (You can find my earlier comment on this topic in my 2013 blog post.)

In sum, “Funny Face” is a glorious film, featuring a radiant Audrey Hepburn, a clever storyline, and countless scenes of Paris.  The Gershwin songs and the wonderful dancing, which blend almost seamlessly into the story, lead to a stunning result.  Even though I didn’t fully appreciate it in 1957, the memory of seeing it back then has stayed with me for the past six decades.  Seeing it again made me realize just how “’s’wonderful” it really is.

Hooray for Hollywood! Part I

As a lifelong film buff (OK, since I was about 4), I have great fondness for much that Hollywood (and foreign cinema) has produced.  Each year I try to see a number of new films and re-watch some of the old ones.

During the past year, I never got around to seeing most of the blockbusters that dominated the box office. According to the online publication The Verge, Disney produced an unprecedented 80 percent of the top box-office hits in 2019.

Thanks to its purchases of Marvel Entertainment (2009) and Lucasfilm (2012) during the last decade, Disney’s slate has included franchises like Star Wars and the Marvel hits, in addition to popular animated films like Frozen and Frozen 2.  The result:  Disney films have dominated the box office.

But I don’t pay a lot of attention to box-office success.  I’m far more focused on seeing films that have something to say to me. This year my clear favorite was Once Upon a Time…in Hollywood.

Once Upon a Time, a Quentin Tarantino film, is not only a fabulous depiction of Hollywood in 1969; it also relates to me and my life in a number of ways.

Spoiler alert:  If you haven’t yet seen this film, DO NOT read the ending of this blog post, where I write about the Manson murders.

First, about the film itself:  It’s been called a “buddy picture,” and in many ways it is.  In two stellar performances, Leonardo DiCaprio (playing the fictional Rick Dalton) and Brad Pitt (playing the fictional Cliff Booth) are indeed buddies.  Rick is a fading former star of a Western TV series, trying to make a comeback in Hollywood, while Cliff is his longtime stunt double.  By 1969, with Rick’s star on the wane, Cliff spends much of his time driving Rick from place to place.  Both are struggling to survive in a Hollywood that has changed from the one they knew.

Weaving fiction and fact throughout the film, Tarantino uses both humor and violence to depict the end of an era.  In this love letter to 1960s Hollywood (which has earned positive reviews from most top critics on Rotten Tomatoes and garnered numerous awards and nominations), he embeds specifics of popular culture and real places in 1969 LA into the film.

The story takes place during two days in February and one day in August of 1969.  Notably, Rick Dalton’s home is right next door to the home of minor film star Sharon Tate (married to director Roman Polanski) in a posh section of western LA, Benedict Canyon.

In this film, Tarantino also skillfully blends in the ugly story of the Charles Manson “family.”

Re-creating in many ways the world that I lived in at about the same time, even though he himself did not live in it, Tarantino provoked a cascade of intensely vivid memories for me.  Here’s why:

I left Chicago in August 1970 and moved to the Westwood neighborhood on the west side of LA, where I rented a cheerful furnished apartment within walking distance of UCLA.

I had moved my “Reggie Fellowship” from the Appellate and Test Case Division of the Chicago Legal Aid Bureau to a health-law-related Legal Services office located at UCLA Law School.  Reggies were predominantly young lawyers who opted to work on behalf of the poor rather than toil in a corporate law firm.  (Please see my more detailed description of the Reggie program in an earlier post, “The Summer of ’69,” published on August 7, 2015.)

Westwood and Westwood Village (the commercial area in Westwood, adjacent to UCLA) loom large in my memory.  I met my husband-to-be (I’ll call him Marv) on the UCLA campus in October 1970, six weeks after I arrived.  Before we met, we had both rented separate apartments in the same apartment building located on the fringe of the campus. We soon began dating, and my memory bank is filled with moments from our courtship and marriage that year.

My new home was very close to much of what happens in the Tarantino film, set only one year earlier.  So when he replicates the sights and sounds of that time, I recall having seen and heard many of them myself.

Examples:  Street signs, ads painted on bus-stop benches, movie posters, commercials, and music. (Some of these are Tarantino’s own inventions.)

Probably the best example:  Sharon Tate goes to see herself in a film at a movie theater in Westwood Village.  During the year that I lived in Westwood, I saw many films at the movie theaters in Westwood Village.  (Seeing “Love Story” with Marv in one of them in December 1970 was especially memorable, and I plan to write about it in a future blog post.)

Another example:  A scene in the movie is set at the famous LA restaurant called Musso & Frank Grill.  Marv and I were both aware of its fame, and during that year we sought it out and dined there one special night.

One more thing:  The stunning area where Sharon Tate and Roman Polanski lived next door to the fictional Rick Dalton (Benedict Canyon) is in western LA, not far from Westwood and very close to Bel-Air.  Marv and I not only lived in Westwood, but we also celebrated our wedding luncheon at the charming Hotel Bel-Air.

Then there’s the Manson family storyline in the movie.  I learned about the Manson murders during a weekend in New York City.  I was spending part of the summer of 1969 at the Reggie training program at Haverford College, near Philadelphia, and I traveled from Philly to NYC one weekend in August.

During trips to NYC, I often stayed with a close friend and law-school classmate (I’ll call her Arlene).  Although Arlene was planning to be out of town that weekend, she invited me to stay in her 86th Street apartment on the East Side of Manhattan without her.  It was a great opportunity to live by myself as a quasi-New Yorker, and I decided to do it.

Returning to her apartment on Saturday evening, I picked up the Sunday New York Times and was shocked by a headline spelling out the startling discovery of the Manson murders.

At that time, I was still living in Chicago, but I had briefly lived in LA when I was 12 and always liked to follow news from there.  So I was riveted by the Manson story and read the paper from cover to cover.

When Tarantino decided to weave this story into the rest of his film, he did what he’d done in Inglourious Basterds and changed the real ending to a much different one.

Watching Once Upon a Time, I was terribly nervous as the film approached its ending.  I knew how the real story turned out, and I didn’t know exactly how this film would portray it.  But what a departure from reality Tarantino created!  The shocking ending to the film includes imaginative violence that is so over-the-top that it’s almost humorous.  Overall, the ending is a clever re-imagining of the fate of the Manson family and a much happier resolution of what happened to their victims.

Although the new ending was violent in its own way, creating an exciting piece of filmmaking, I left the theater in a much sunnier frame of mind than I would have if Tarantino had re-created the actual massacre that took place in 1969.

In sum, Once Upon a Time is, to my mind, an absorbing and fascinating film.  For me, it was one of the best films of 2019.

I plan to write again about Hollywood films that have been relevant to my own life.  Part II will begin to explore classic films that have done just that.

A Snowy April 1st

On the morning of April 1st, The New York Times reported that the city had woken up to an April snowstorm, “with about 5 inches of snow expected to produce slushy streets and a tough morning commute.”  The storm followed a string of storms that had hit the East Coast in March with heavy snows and damaging winds.

This New York story about snow on April 1st reminded me of another April 1st snowstorm:  The one in Chicago that changed my life.

In the spring of 1970, I was already questioning whether I wanted to spend another year in Chicago.  My work at the Appellate and Test Case Division of the Chicago Legal Aid Bureau had its good points.  I was co-counsel with a lawyer at the Roger Baldwin Foundation of the ACLU (who happily became a lifelong friend) in a case challenging the restrictive Illinois abortion law, a law that made any abortion nearly impossible for all but the most affluent women in Illinois.  Our case was moving forward and had already secured a TRO allowing a teenage rape victim an emergency abortion.  A great legal victory!

But the rest of my life was at a standstill.  I was dating some of the men I’d met, but I hadn’t encountered anyone I wanted to pair up with.  In fact, I’d recently dumped a persistent suitor I found much too boring.  Relying on old friendships led to occasional lunches with both men and women I’d known in school, but the women were happily married and had limited time for a single woman friend.  I tried striking up friendships with other women as well as men, but so far that hadn’t expanded my social life very much.

I also haunted the Art Institute of Chicago, attending evening lectures and lunchtime events.  The art was exhilarating, but good times there were few.  When I turned up for an event one Sunday afternoon and left a few hours later, planning to take a bus home, I was surprised to see almost no one else on Michigan Avenue, leaving me feeling isolated and (in today’s parlance) somewhat creeped-out.  (In 1970 Chicago hadn’t yet embarked on the kind of Sunday shopping that would bring people downtown on a Sunday afternoon.)  Similarly, I bought tickets for a piano series at Orchestra Hall, and a series of opera tickets, but again I often felt alone among a group of strangers.

I still had lots of family in the area.  But being surrounded by family wasn’t exactly what I was looking for just then.

So although I was feeling somewhat wobbly about staying in Chicago, the question of where to settle instead loomed large.  When I’d left law school three years earlier and assumed a two-year clerkship with a federal judge in Chicago, I’d intended to head for Washington DC when my clerkship ended.  But in the interim Tricky Dick Nixon had lied his way into the White House, and I couldn’t abide the idea of moving there while he was in charge.

My thoughts then turned to California.  I’d briefly lived in Los Angeles during 8th grade (a story for another day) and very much wanted to stay, but my mother’s desire to return to Chicago after my father’s death won out.  Now I remembered how much I loved living in sunny California.  A February trip to Mexico had reinforced my thinking that I could happily live out my days in a warm-weather climate instead of slogging away in Chicago, winter after Chicago winter.

So I began making tentative efforts to seek out work in either LA or San Francisco, cities where I already had some good friends.

What happened on April 1st sealed the deal.  I’d made my way to work that morning despite the heavy snow that had fallen, and I took my usual ride home on a bus going down Michigan Avenue to where I lived just north of Oak Street.  The bus lumbered along, making its way through the snow-covered city, its major arteries by that time cleared by the city’s snow plows.  When the bus driver pulled up at the stop just across Lake Shore Drive from my apartment building, he opened the bus’s door, and I unsuspectingly descended the stairs to emerge outside.

Then, it happened.  I put a foot out the door, and it sank into a drift of snow as high as my knee.  I was wearing one of the miniskirts I favored back then, and my foot and leg were now stuck in the snow.  The bus abruptly closed its door, and I was left stranded in a snowbank, forced to pull myself out of it and attempt to cross busy Lake Shore Drive.

On April 1st.

Then and there I resolved to leave Chicago.  No ifs, ands, or buts about it.  I made up my mind to leave the snow-ridden city and head for warmer climes.

And I did.  After a May trip to the sunny West Coast, where I interviewed for jobs in both Los Angeles and San Francisco (with kind friends hosting me in both cities), I wound up accepting a job offer at a poverty-law support center at UCLA law school and renting a furnished apartment just across Gayley Avenue from the campus.

The rest is (my personal) history.  I immediately loved my new home and my new job.  Welcomed by friends, both old and new (including my brand-new colleagues at UCLA), I was happy to have left Chicago and its dreary winters behind.  And six weeks after arriving in LA, I met the wonderful guy I married a few months later.

What happened next?  I’ll save that for still another day.  But here’s the take-away:  a snowstorm on April 1st changed my life.  Maybe it can change yours, too.

Hamilton, Hamilton…Who Was He Anyway?

Broadway megahit “Hamilton” has brought the Founding Parent (okay, Founding Father) into a spotlight unknown since his own era.

Let’s face it.  The Ron Chernow biography, turned into a smash Broadway musical by Lin-Manuel Miranda, has made Alexander Hamilton into the icon he hasn’t been–or maybe never was–in a century or two. Just this week, the hip-hop musical “Hamilton” received a record-breaking 16 Tony Award nominations.

His new-found celebrity has even influenced his modern-day successor, current Treasury Secretary Jack Lew, leading Lew to reverse his earlier plan to remove Hamilton from the $10 bill and replace him with the image of an American woman.

Instead, Hamilton will remain on the front of that bill, with an image honoring the leaders of the 1913 woman suffrage march appearing on the back, while Harriet Tubman will replace no-longer-revered and now-reviled President Andrew Jackson on the front of the $20 bill.  We’ll see other changes to our paper currency during the next five years.

But an intriguing question remains:  How many Americans—putting aside those caught up in the frenzy on Broadway, where theatergoers are forking over $300 and $400 to see “Hamilton” on stage—know who Hamilton really was?

A recent study done by memory researchers at Washington University in St. Louis has confirmed that most Americans are confident that Hamilton was once president of the United States.

According to Henry L. Roediger III, a human memory expert at Wash U, “Our findings from a recent survey suggest that about 71 percent of Americans are fairly certain that [Hamilton] is among our nation’s past presidents.  I had predicted that Benjamin Franklin would be the person most falsely recognized as a president, but Hamilton beat him by a mile.”

Roediger (whose official academic title is the James S. McDonnell Distinguished University Professor in Arts & Sciences) has been testing undergrad college students since 1973, when he first administered a test while he was himself a psychology grad student at Yale. His 2014 study, published in the journal Science, suggested that we as a nation do fairly well at naming the first few and the last few presidents.  But fewer than 20 percent of us can remember more than the last 8 or 9 presidents in order.

Roediger’s more recent study is a bit different because its goal was to gauge how well Americans simply recognize the names of past presidents.  Name-recognition should be much less difficult than recalling names from memory and listing them on a blank sheet of paper, which was the challenge in 2014.

The 2016 study, published in February in the journal Psychological Science, asked participants to identify past presidents, using a list of names that included actual presidents as well as famous non-presidents like Hamilton and Franklin.  Other familiar names from U.S. history, and non-famous but common names, were also included.

Participants were asked to indicate their level of certainty on a scale from zero to 100, where 100 was absolutely certain.

What happened?  The rate for correctly recognizing the names of past presidents was 88 percent overall, although laggards Franklin Pierce and Chester Arthur rated less than 60 percent.

Hamilton was more frequently identified as president (with 71 percent thinking that he was) than several actual presidents, and people were very confident (83 on the 100-point scale) that he had been president.

More than a quarter of the participants incorrectly recognized others, notably Franklin, Hubert Humphrey, and John Calhoun, as past presidents.  Roediger thinks that probably happened because people are aware that these were important figures in American history without really knowing what their actual roles were.

Roediger and his co-author, K. Andrew DeSoto, suggest that our ability to recognize the names of famous people hinges on their names appearing in a context related to the source of their fame.  “Elvis Presley was famous, but he would never be recognized as a past president,” Roediger says.   It’s not enough to have a familiar name.  It must be “a familiar name in the right context.”

This study is part of an emerging line of research focusing on how people remember history.  The recent studies reveal that the ability to remember the names of presidents follows consistent and reliable patterns.  “No matter how we test it—in the same experiment, with different people, across generations, in the laboratory, with online studies, with different types of tests—there are clear patterns in how the presidents are remembered and how they are forgotten,” DeSoto says.

While decades-old theories about memory can explain the results to some extent, these findings are sparking new ideas about fame and just how human memory-function treats those who achieve it.

As Roediger notes, “knowledge of American presidents is imperfect….”  False fame can arise from “contextual familiarity.”  And “even the most famous person in America may be forgotten in as short a time as 50 years.”

So…how will Alexander Hamilton’s new-found celebrity hold up?  Judging from the astounding success of the hip-hop musical focusing on him and his cohorts, one can predict with some confidence that his memory will endure far longer than it otherwise might have.

This time, he may even be remembered as our first Secretary of the Treasury, not as the president he never was.

Take a hike

The lure of “the gym” has always escaped me. I’ve joined a few fitness centers in my day, but I consistently end up abandoning the gym and resorting to my preferred route to fitness: walking. Whenever possible, I walk and hike in the great outdoors.

A host of recent studies has validated my faith in the benefits of walking. And some of these benefits may surprise you.

First, being active is better for your health. Duh. We’ve all suspected that for a long time. But here’s a new finding: sitting may be the real problem. Studies show that the more you sit, the greater your risk for health problems. In a study of more than two thousand adults ages 60 and older, every additional hour a day spent sitting was linked to a 50 percent greater risk of disability. Even those who got some exercise but sat too much were more likely to end up disabled.

Dorothy Dunlop and her colleagues at Northwestern’s Feinberg School of Medicine concluded that sitting seems to be a separate risk factor. Getting enough exercise is important, but it’s equally important not to be a couch potato the rest of the time. Their study appeared in the Journal of Physical Activity & Health in 2014.

Another study, published in Medicine & Science in Sports & Exercise, noted something else about prolonged sitting: taking “short walking breaks” at least once an hour may lessen or even prevent some of the adverse effects, especially on the cardiovascular system. When healthy young men sat for 3 hours without moving their legs, endothelial function—the ability of blood vessels to expand and contract—dropped significantly from the very beginning. But when they broke up their sitting time with slow 5-minute walks every 30 or 60 minutes, endothelial function did not decline.

Here’s another benefit: Exercise, including walking, can keep you from feeling depressed. A British study, reported in JAMA Psychiatry, followed over 11,000 people (initially in their early 20s) for more than 25 years. It found that the more physically active they were, the less likely they were to have symptoms of depression. For example, sedentary people who started exercising 3 times a week reduced their risk of depression 5 years later by almost 20 percent. The researchers concluded that being active “can prevent and alleviate depressive symptoms in adulthood.”

Ready for one more reason to walk? A study described in The Wall Street Journal in 2014 found that walking can significantly increase creativity. This is a brand new finding. In the past, studies have shown that after exercise, people usually perform better on tests of memory and the ability to make decisions and organize thoughts. Exercise has also been linked anecdotally to creativity: writers and artists have said for centuries that their best ideas have come during a walk. But now science supports that link.

Researchers at Stanford University, led by Dr. Marily Oppezzo, decided to test the notion that walking can inspire creativity. They gathered a group of students in a deliberately unadorned room equipped with nothing more than a desk and a treadmill. The students were asked to sit and complete “tests of creativity,” like quickly coming up with alternative uses for common objects, e.g., a button. Facing a blank wall, the students then walked on the treadmill at an easy pace, repeating the creativity tests as they walked. Result: creativity increased when the students walked. On average, they came up with about 60 percent more “novel and appropriate” uses for the objects.

Dr. Oppezzo then tested whether these effects lingered. The students repeated the test when they sat down after their walk on the treadmill. Again, walking markedly improved their ability to generate creative ideas, even when they had stopped walking. They continued to produce more and better ideas than they had before their walk.

When Dr. Oppezzo moved the experiment outdoors, the findings surprised her. The students who walked outside did come up with more creative ideas than when they sat, either inside or outside, but walking outside did not lead to more creativity than walking inside on the treadmill. She concluded that “it’s the walking that matters.”

So a brief stroll apparently leads to greater creativity. But the reasons for it are unclear. According to Dr. Oppezzo, “It may be that walking improves mood,” and creativity blooms more easily when one is happier. The study appeared in The Journal of Experimental Psychology: Learning, Memory, and Cognition in 2014.

In truth, I don’t need these studies to convince me to keep walking. It helps that I live in San Francisco, where the climate allows me to walk outside almost every day. Walking is much more challenging when you confront the snow and ice that used to accompany my walks in and around Chicago. So I’m not surprised that walkers in colder climes often resort to exercising indoors.

It also helps that San Francisco has recently been voted the second most walkable city in America. According to Walk Score, an organization that ranks the “walkability” of 2,500 cities in the U.S., SF placed just behind New York City as the most walkable major American city.

SF’s high score is especially impressive in light of the city’s hills. Although I avoid the steepest routes, I actually welcome a slight incline because it adds to my aerobic workout. Why use a Stairmaster in a gloomy gym when I can climb uphill enveloped in sunshine and cool ocean breezes?

But whether you walk indoors or out, do remember to walk! You’ll assuredly benefit health-wise. And you just may enhance your creativity quotient. Someday you may even find yourself writing a blog like this one.