
A Snowy April 1st

On the morning of April 1st, The New York Times reported that the city had woken up to an April snowstorm, “with about 5 inches of snow expected to produce slushy streets and a tough morning commute.”  The storm followed a string of storms that had hit the East Coast in March with heavy snows and damaging winds.

This New York story about snow on April 1st reminded me of another April 1st snowstorm:  The one in Chicago that changed my life.

In the spring of 1970, I was already questioning whether I wanted to spend another year in Chicago.  My work at the Appellate and Test Case Division of the Chicago Legal Aid Bureau had its good points.  I was co-counsel with a lawyer at the Roger Baldwin Foundation of the ACLU (who happily became a lifelong friend) in a case challenging the restrictive Illinois abortion law, a law that made any abortion nearly impossible for all but the most affluent women in Illinois.  Our case was moving forward and had already secured a TRO allowing a teenage rape victim an emergency abortion.  A great legal victory!

But the rest of my life was at a standstill.  I was dating some of the men I’d met, but I hadn’t encountered anyone I wanted to pair up with.  In fact, I’d recently dumped a persistent suitor I found much too boring.  Relying on old friendships led to occasional lunches with both men and women I’d known in school, but the women were happily married and had limited time for a single woman friend.  I tried striking up friendships with other women as well as men, but so far that hadn’t expanded my social life very much.

I also haunted the Art Institute of Chicago, attending evening lectures and lunchtime events.  The art was exhilarating, but good times there were few.  When I turned up for an event one Sunday afternoon and left a few hours later, planning to take a bus home, I was surprised to see almost no one else on Michigan Avenue, leaving me feeling isolated and (in today’s parlance) somewhat creeped out.  (In 1970 Chicago hadn’t yet embarked on the kind of Sunday shopping that would bring people downtown on a Sunday afternoon.)  Similarly, I bought tickets for a piano series at Symphony Hall, and a series of opera tickets, but again I often felt alone among a crowd of strangers.

I still had lots of family in the area.  But being surrounded by family wasn’t exactly what I was looking for just then.

So although I was feeling somewhat wobbly about staying in Chicago, the question of where to settle instead loomed large.  When I’d left law school three years earlier and assumed a two-year clerkship with a federal judge in Chicago, I’d intended to head for Washington DC when my clerkship ended.  But in the interim Tricky Dick Nixon had lied his way into the White House, and I couldn’t abide the idea of moving there while he was in charge.

My thoughts then turned to California.  I’d briefly lived in Los Angeles during 8th grade (a story for another day) and very much wanted to stay, but my mother’s desire to return to Chicago after my father’s death won out.  Now I remembered how much I loved living in sunny California.  A February trip to Mexico had reinforced my thinking that I could happily live out my days in a warm-weather climate instead of slogging away in Chicago, winter after Chicago winter.

So I began making tentative efforts to seek out work in either LA or San Francisco, cities where I already had some good friends.

What happened on April 1st sealed the deal.  I’d made my way to work that morning despite the heavy snow that had fallen, and I took my usual ride home on a bus going down Michigan Avenue to where I lived just north of Oak Street.  The bus lumbered along, making its way through the snow-covered city, its major arteries by that time cleared by the city’s snow plows.  When the bus driver pulled up at the stop just across Lake Shore Drive from my apartment building, he opened the bus’s door, and I unsuspectingly descended the stairs to emerge outside.

Then, it happened.  I put a foot out the door, and it sank into a drift of snow as high as my knee.  I was wearing one of the miniskirts I favored back then, and my foot and leg were now stuck in the snow.  The bus driver abruptly closed the door behind me, and I was left stranded in a snowbank, forced to pull myself out of it and attempt to cross busy Lake Shore Drive.

On April 1st.

Then and there I resolved to leave Chicago.  No ifs, ands, or buts about it.  I made up my mind to leave the snow-ridden city and head for warmer climes.

And I did.  After a May trip to the sunny West Coast, where I interviewed for jobs in both Los Angeles and San Francisco (with kind friends hosting me in both cities), I wound up accepting a job offer at a poverty-law support center at UCLA law school and renting a furnished apartment just across Gayley Avenue from the campus.

The rest is (my personal) history.  I immediately loved my new home and my new job.  Welcomed by friends, both old and new (including my brand-new colleagues at UCLA), I was happy to have left Chicago and its dreary winters behind.  And six weeks after arriving in LA, I met the wonderful guy I married a few months later.

What happened next?  I’ll save that for still another day.  But here’s the take-away:  a snowstorm on April 1st changed my life.  Maybe it can change yours, too.

 

Munching on Meatloaf

Meatloaf, that old standby, has just acquired a new cachet.  Or has it?

A recent column by Frank Bruni in The New York Times focused on food snobs, in particular their ridicule of Donald Trump’s love of meatloaf.  Weeks earlier, Trump had “forced Chris Christie to follow his lead at a White House lunch and eat meatloaf, which the president praised as his favorite item on the menu.”

According to Bruni, a former restaurant critic, news coverage of the lunch “hinted that Trump wasn’t merely a bully but also a rube.  What grown-up could possibly be so fond of this retro, frumpy dish?”

Bruni’s answer:  “Um, me.  I serve meatloaf at dinner parties.  I devoted a whole cookbook to it.”

Allow me to join forces with Frank Bruni.  Putting aside my general negativity towards all things Trump, I have to admit I’m fond of meatloaf, too.

My recollections of eating meatloaf go back to the dining-room table in our West Rogers Park apartment in the 1950s.  My mother was never an enthusiastic cook.  She prepared meals for us with a minimal degree of joy, no doubt wishing she could spend her time on other pursuits.  It was simply expected of her, as the wife and mother in our mid-century American family, to come up with some sort of breakfast, lunch, and dinner nearly every day.

Breakfasts rarely featured much more than packaged cereal and milk.  I remember putting a dusting of sugar on corn flakes—something I haven’t done since childhood.  Did we add fresh fruit to our cereal?  Not very often.  We might have added raisins.   But fresh fruit, like the abundant blueberries and strawberries we can now purchase all year long, wasn’t available in Chicago grocery stores during our long cold ‘50s winters.  At least not in our income bracket.

Daddy occasionally made breakfast on the weekends.  I remember watching him standing in front of our ‘30s-style mint green enamel-covered stove, whipping up his specialty, onions and eggs, with aplomb.  But those highly-anticipated breakfasts were rare.

[I recently discovered that stoves like that one are still available.  They’re advertised online by a “retro décor lover’s dream resource” in Burbank, as well as on eBay, where an updated model is currently listed for $4,495.]

As for lunch, my public grade school compelled us to walk home for lunch every day.  Only a handful of sub-zero days broke that mold.  Our school had no cafeteria, or even a lunchroom, where kids could eat in frigid weather.  Only on alarmingly cold days were we permitted to bring a lunch from home and eat it in the school auditorium.  If we pleaded convincingly enough, our parents might let us buy greasy hamburgers at Miller’s School Store.

Most days I’d walk home, trudging the six long blocks from school to home and back within an hour.  Mom would have lunch waiting for me on our breakfast-room table, mostly sandwiches and the occasional soup.  Mom rarely made her own soup.  She generally opened cans of Campbell’s “vegetarian vegetable,” eschewing canned soups that included any possibility of unknown meat.

Mom’s dinner specialties included iceberg-lettuce salads, cooked veggies and/or potatoes, and a protein of some kind.  Because of her upbringing, she invariably chose fish, poultry, or cuts of meat like ground beef, beef brisket, and lamb chops.

Which brings us to meatloaf.

I must have liked Mom’s meatloaf because I don’t have a single negative memory associated with it.  And when I got married and began preparing meals for my own family, I never hesitated to make meatloaf myself.

Fortunately, I didn’t have to prepare dinner every night.  I was immensely lucky to marry a man who actually enjoyed cooking.  Although I inherited my mother’s reluctance to spend much time in the kitchen, Herb relished preparing elaborate gourmet dishes à la Julia Child—in fact, he often used her cookbook—and proudly presenting them to our daughters and me whenever his schedule allowed.

But when I was the cook, meatloaf was one of my favorite choices.  I’d buy lean ground beef, add breadcrumbs, ketchup, and assorted herbs and spices, mix it all together with my bare hands, and heat the finished product until it was just right.  Aware by then of warnings about high-fat red meat, I’d carefully remove my loaf pan from the oven and scrupulously drain as much fat from the pan as I could.  The result?  A tasty and relatively low-fat dish.  My family loved it.

At some point I discovered the glories of leftover meatloaf.  Chilled in the fridge overnight, it made a toothsome sandwich the next day.  It was especially good on rye bread and loaded with ketchup.  Wrapped in a plastic baggie, it would go from home to wherever I traveled to work, and I had to use my most stalwart powers of self-discipline to wait till lunchtime to bite into its deliciousness.

Those days are sadly over.  I rarely prepare dinner for my family anymore, and my consumption of meat products has gone way down.  Most days, when I reflect on what I’ve eaten, I realize that, more often than not, I’ve unknowingly eaten a wholly vegetarian diet.

I haven’t eaten meatloaf in years.  But hearing about Trump’s penchant for it has awakened my taste buds.  If I could just get my hands on a tasty low-fat version like the one I used to make, my long meatloaf drought might finally be over.

A Day Without a Drug Commercial

Last night I dreamed there was a day without a drug commercial….

When I woke up, reality stared me in the face.  It couldn’t be true.  Not right now.  Not without revolutionary changes in the drug industry.

Here are some numbers that may surprise you.  Or maybe not.

Six out of ten adults in the U.S. take a prescription medication.  That’s up from five out of ten a decade ago.  (These numbers appeared in a recent study published in the Journal of the American Medical Association.)

Further, nine out of ten people over 65 take at least one drug, and four out of ten take five or more—nearly twice as many as a decade ago.

One more statistic:  insured adults under 65 are twice as likely to take medication as the uninsured.

Are you surprised by any of these numbers?  I’m not.

Until the 1990s, drug companies largely relied on physicians to promote their prescription drugs.  But in 1997, the Food and Drug Administration revised its earlier rules on direct-to-consumer (DTC) advertising, putting fewer restrictions on the advertising of pharmaceuticals on TV and radio, as well as in print and other media.  We’re one of only two countries (New Zealand is the other) that permit this kind of advertising.

The FDA is responsible for regulating DTC advertising and is supposed to take ethical and other concerns into account to prevent its undue influence on consumer demand.  The fear was that advertising would create demand for medically unnecessary prescription meds.

It’s pretty clear to me that it has.  Do you agree?

Just look at the statistics.  The number of people taking prescription drugs increases every year.  In my view, advertising has encouraged them to seek drugs that may be medically unnecessary.

Of course, many meds are essential to preserve a patient’s life and health.  But have you heard the TV commercials?  Some of them highlight obscure illnesses that affect only a small number of TV viewers.  But whether we suffer from these ailments or not, we’re all constantly assaulted by these ads.  And think about it:  If you feel a little under the weather one day, or a bit down in the dumps because of something that happened at work, or just stressed because the neighbor’s dog keeps barking every night, might those ads induce you to call your doc and demand a new drug to deal with it?

The drug commercials appear to target those who watch daytime TV—mostly older folks and the unemployed.  Because I work at home, I sometimes watch TV news while I munch on my peanut butter sandwich.  But if I don’t hit the mute button fast enough, I’m bombarded by annoying ads describing all sorts of horrible diseases.  And the side effects of the drugs?  Hearing them recited (as rapidly as possible) is enough to make me lose my appetite.  One commercial stated some possible side effects:  suicidal thoughts or actions; new or worsening depression; blurry vision; swelling of face, mouth, hands or feet; and trouble breathing.  Good grief!  The side effects sounded worse than the disease.

I’m not the only one annoyed by drug commercials.  In November 2015, the American Medical Association called for a ban on DTC ads of prescription drugs. Physicians cited genuine concerns that a growing proliferation of ads was driving the demand for expensive treatments despite the effectiveness of less costly alternatives.  They also cited concerns that marketing costs were fueling escalating drug prices, noting that advertising dollars spent by drug makers had increased by 30 percent in the previous two years, totaling $4.5 billion.

The World Health Organization has also concluded that DTC ads promote expensive brand-name drugs.  WHO has recommended against allowing DTC ads, noting surveys in the US and New Zealand showing that when patients ask for a specific drug by name, they receive it more often than not.

Senator Bernie Sanders has repeatedly stated that Americans pay the highest prices in the world for prescription drugs.  He and other Senators introduced a bill in 2015 aimed at curbing skyrocketing drug prices, and Sanders went on to rail against those prices during his 2016 presidential campaign.

Another member of Congress, Representative Rosa DeLauro (D-Conn.), has introduced a bill specifically focused on DTC ads.  The bill calls for a three-year moratorium on advertising new prescription drugs directly to consumers, with the aim of holding down health-care costs.

DeLauro has argued, much like the AMA, that DTC ads can inflate health-care costs if they prompt consumers to seek newer, higher-priced meds.  The Responsibility in Drug Advertising Act would amend the current Food, Drug, and Cosmetic Act and is the latest effort to squelch DTC advertising of prescription meds.

The fact that insured adults under 65 are twice as likely to take prescription meds as those who are not insured highlights a couple of things:  that these ads are pretty much about making more and more money for the drug manufacturers, and that most of the people who can afford the advertised drugs are either insured or in an over-65 program covering many of their medical expenses.  So it’s easy to see that manufacturers can charge inflated prices because these consumers are reimbursed by their insurance companies.  No wonder health insurance costs so much!  Meanwhile, those who are uninsured must struggle to pay the escalating prices or go without the drugs they genuinely need.

Not surprisingly, the drug industry trade group, the Pharmaceutical Research and Manufacturers of America, has disputed the argument that DTC ads play “a direct role in the cost of new medicines.”  It claims that most people find these ads useful because they “tell people about new treatments.”  It’s probably true that a few ads may have a public-health benefit.  But I doubt that very many fall into that category.

Hey, Big Pharma:  If I need to learn about a new treatment for a health problem, I’ll consult my physician.  I certainly don’t plan to rely on your irritating TV ads.

But…I fear that less skeptical TV viewers may do just that.

So please, take those ads off the air.  Now.

If you do, you know what?  There just might be a day without a drug commercial….

 

[The Wellness Letter published by the University of California, Berkeley, provided the statistics noted at the beginning of this post.]

 


Looking Back…The Election of 1984 (Part II)

I wrote Part I of this blog post in late 1984.  In Part I, I commented on the campaign for president and vice president that had occurred that fall.

Part II, also written in 1984, offered my thoughts at the time about what might take place post-1984.

During the past 32 years, we’ve seen another major political party nominate a woman to be vice president.  In my view, the selection of Sarah Palin as that candidate in 2008 was John McCain’s replication of Walter Mondale’s unhappy selection of Geraldine Ferraro.  It was perhaps even more detrimental to McCain because he probably had a better chance of being elected president than Mondale had in 1984. Palin was even more untested as a political figure than Ferraro, having served only as a suburban mayor and a recently elected governor of a small state.  She soon demonstrated her lack of experience and knowledge of national issues, making her a genuine liability for McCain, who lost the support of many voters who might have otherwise been inclined to vote for him.

In 2016, American voters finally have the opportunity to select a woman as their president.  This time she’s a woman with a great deal of experience in public life and vast knowledge of the issues confronting our nation.  Although, as a candidate, Hillary Clinton hasn’t inspired unbridled enthusiasm, she’s as close to a “woman candidate of national stature” (to use my own words) as we’ve ever had.  In 1984, I predicted that a “woman candidate of national stature” whose position “represents the majority thinking in the country” would be “a realistic candidate,…and she will win.”

Was I right?

Here’s exactly what I wrote in 1984:

 

PART II

How does this leave things for the future?  Putting aside the personal future of Geraldine Ferraro, which is probably bright, what about other women candidates?  And what about the possibility of any woman being nominated and elected to the presidency or vice presidency of this country?  The Mondale-Ferraro defeat should not and must not be read as a defeat for women candidates in general.  Ferraro’s assets, both as a candidate and as a human being, are considerable, but, to be honest, she joined the campaign largely unknown and untested.  Another woman candidate might well fare otherwise.

Twenty years ago [i.e., in 1964], Margaret Chase Smith, a well-known and respected Republican U.S. Senator from Maine, announced her candidacy for the presidency.  She never had a realistic shot at it in that benighted era, but she might have had one in the 1980s.  She had established herself through a number of terms in the House of Representatives and the Senate, had climbed up the ladder in the Senate to committee chairmanships, and had become a recognized and admired figure on the national political scene.  A woman presenting similar credentials in the 1980s would bring a credibility to a national ticket that Ferraro, as a relative newcomer to the political arena, could not.  For this reason it’s important that women continue to run for political office on the state and local level, building political careers that will lead to the White House after they have achieved national stature—not before.

In all of the fuss made over Ferraro’s candidacy, something important was forgotten.  It’s not desirable for any political party to nominate a candidate solely or even primarily because that candidate is a woman or a black or a Hispanic—or a white Anglo male, for that matter.  The selection process must be based on the totality of what any given candidate will bring to the office.  The Democrats were wrong to select a woman candidate largely because she was a woman (those who said that a man with Ferraro’s credentials would never have been considered were—however painful it is to admit—correct).  They were wrong because Americans do not, and should not, vote for “symbols.”  When it became clear that Jesse Jackson wasn’t a candidate with a broad-based constituency but had become a “black” candidate and nothing more, that was the death knell for any realistic chance he had of winning the nomination.  But saying that is not saying that no black candidate can ever win.

Women candidates and candidates who are members of minority groups have run for office and won broad-based electoral support where they have been viewed as representing the best interests of a majority of the electorate.  But women and others who are viewed as “symbols,” representing only that segment of the electorate from which they came, will never win that sort of broad-based support.  On the contrary, their candidacies may serve only to polarize voters, leading to strife and bitterness among the electorate, and probable if not certain defeat at the ballot box.

When Mondale chose Ferraro, he already had the votes of the politically aware women for whom Ferraro became a symbol by virtue of his position on such issues as the ERA [the Equal Rights Amendment] and [the issue of] comparable worth.  He would not have lost the votes of those women no matter what else he did.  Likewise, Reagan didn’t have the votes of those women and wouldn’t have had them no matter what he did.  Even in the unimaginable event that Reagan had selected a woman running-mate, she would have had to be a woman whose thinking was compatible with his, and if she had endorsed Reagan’s views on the ERA (à la Phyllis Schlafly), feminists wouldn’t have been any more likely to vote for Reagan-Schlafly than Reagan-Bush.  It shouldn’t therefore be terribly difficult to understand why women who were otherwise happy with Reagan weren’t inclined to switch to Mondale simply because of Ferraro.

In sum, women voters are really not very different from men voters, and Democratic strategists who thought otherwise were proved wrong in 1984.  Women vote their interests, and these do not necessarily coincide with what is popularly perceived as “women’s” interests.  Women, like men, are concerned about the economy, our country’s status in the world, and a host of other matters along with the particular concerns they may have as women.

When a woman candidate of national stature emerges whose position on these interests represents the majority thinking in the country, she will be a realistic candidate for the vice presidency or the presidency, and she will win.

Looking Back…The Election of 1984

If you’ve followed politics for as long as I have, you probably remember the election of 1984.  In the race for U.S. president, Ronald Reagan was the Republican incumbent, first elected in 1980 and seeking re-election in 1984.  Most observers predicted that he would succeed.

Opposing him was the Democratic nominee, Walter Mondale.

I found the campaign for president so absorbing that shortly after Mondale lost, I wrote a piece of commentary on the election.  Somewhat astoundingly, I recently came across that long-lost piece of writing.

Regrettably, I never submitted it for publication.  Why?  In 1984 I was active in local politics (the New Trier Democratic Organization, to be specific), and I was apprehensive about the reaction my comments might inspire in my fellow Democrats.

Reviewing it now, I wish I’d submitted it for publication.

On June 11th of this year, after Hillary Clinton appeared to be the Democratic nominee for president, The New York Times published a front-page story by Alison Mitchell, “To Understand Clinton’s Moment, Consider That It Came 32 Years After Ferraro’s.”  Mitchell’s article is a brilliant review of what happened in 1984 and during the 32 years since.  My commentary is different because it was actually written in 1984, and it presents the thinking of a longstanding political observer and a lifelong Democrat at that point in time.

Here’s the commentary I wrote just after the election in November 1984.  It was typed on an Apple IIe computer (thanks, Steve Wozniak) and printed on a flimsy dot-matrix printer.  It’s almost exactly what I wrote back then, minimally edited, mostly to use contractions and omit completely unnecessary words.  I’ve divided it into two parts because of its length.

 

PART I

Although Walter Mondale conducted a vigorous and courageous campaign, perhaps nothing he did or did not do would have altered the ultimate result.  But his fate was probably sealed last July when he made two costly political mistakes.  He chose to tell the American people that he’d increase taxes, and he chose Geraldine Ferraro as his running mate.

Savvy political observers have always known that talk of increased taxes is the kiss of death for any candidate.  One wonders what made Walter Mondale forget this truism and instead decide to impress the electorate with his honesty by telling them what they had to know (or, rather, what he thought they had to know) about the deficit.  By making the deficit—a highly intangible concept to the average American voter—a cornerstone of his campaign, Mondale committed the political gaffe of the decade.  One can imagine the glee in the White House the night Mondale gave his acceptance speech and tipped his hand.  The most popular theme of the Reagan campaign became identifying Mondale with the idea of “tax, tax, tax; spend, spend, spend,” a theme that had spelled doom for Jimmy Carter and came to do the same for his Vice President.

Mondale’s choice of Geraldine Ferraro as his running mate was surely not a gaffe of the magnitude of his promise to increase taxes, but as a political judgment it was almost equally unwise.  Mondale faced a popular incumbent president.  All the signposts, even back in July, indicated that the American people were largely satisfied with Reagan and willing to give him another term.  To unseat a popular sitting president, Mondale—who’d been through a bloody primary campaign and emerged considerably damaged—had to strengthen his ticket by choosing a running mate with virtually no liabilities.  He simply couldn’t afford them.

Some of the best advice Mondale got all year was George McGovern’s suggestion that he choose Gary Hart as his vice president.  In one stroke, Mondale could have won the support of those backing his most formidable opponent, many of whom had threatened to go over to Reagan if their candidate wasn’t nominated.  Like Reagan in 1980, Mondale could have solidified much of the divided loyalty of his party behind him by choosing the opponent who’d come closest to beating him by arousing voters’ enthusiasm.  Instead he chose to pass over Hart and several other likely candidates and to select a largely unknown three-term congresswoman from New York City.

It pains me, as a feminist and an ardent supporter of women’s rights, to say this, but it must be said:  Mondale’s choice of Ferraro, however admirable, was a political mistake.  When the pressure from NOW and others to choose a woman candidate arose and gradually began to build, I felt uneasy.  When Congresswoman Patricia Schroeder (for whom I have otherwise unlimited respect) announced that if Mondale didn’t choose Hart, he had to choose a woman, my uneasiness increased.  And when Mondale at last announced his choice of Ferraro, my heart sank.  I was personally thrilled that a woman was at last on a national ticket, but I knew immediately that the election was lost, and that everything a Mondale administration might have accomplished in terms of real gains for women had been wiped out by his choice of a woman running-mate.

There was no flaw in Ferraro herself that ensured the defeat of the Mondale-Ferraro ticket.  She’s an extremely bright, attractive, competent congresswoman and proved herself to be a gifted and inspiring V.P. candidate.  She has, by accepting the nomination, carved out a secure place for herself in the history books and maybe a significant role in national politics for decades to come.  She deserves all this and perhaps more.  But one must wonder whether even Ferraro in her own secret thoughts pondered the political wisdom of her choice as Mondale’s running mate.  If she is as good a politician as I think she is, I can’t help thinking that she herself must have wondered, “Why me, when he could have anyone else?  Will I really help the ticket? Well, what the hell, I’ll give it a shot!  It just might work.”

And it just might—someday.  But in 1984, up against a “Teflon President,” Mondale needed much more.  Reagan was playing it safe, and Mondale wasn’t.  Some observers applauded his choice of Ferraro as the kind of bold, courageous act he needed to bring excitement to a dull, plodding campaign.  But American voters weren’t looking for bold and courageous acts.  They wanted a President who didn’t rock the boat, a boat with which they were largely satisfied.  They might have been willing to throw out the current occupant of the White House if Mondale had been able to seize upon some popular themes and use them to his advantage.  Instead, the Reagan administration seized upon the tax-and-spend issue and the relatively good status of the economy to ride to victory while Mondale was still groping for a theme that might do the same for him.  And all the while he had a running mate with a liability:  a woman who had no national political stature and who turned out to have considerable problems of her own (notably, a messy financial situation).

Reagan compared Mondale’s choice of Ferraro to his own appointment of Sandra Day O’Connor to the U.S. Supreme Court.  In the sense that both men selected highly capable but little-known women and in one stroke catapulted them to the top of their professions, Reagan was right.  But Reagan’s choice was very different and politically much smarter.  A V.P. candidate must be judged by the entire American electorate; a Supreme Court nominee is judged only by the U.S. Senate.  A vice president must stand alone, the metaphorical heartbeat away from the presidency; a Supreme Court justice is only one of nine judges on a court where most issues are not decided 5 to 4.  [We all recognize that this description of the Court in 1984 no longer fits in 2016.  But a single justice on the Court is still only one of nine.]

Let’s face it:  the notion of a woman V.P. (and the concomitant possibility of a woman president) is one that some Americans are clearly not yet comfortable with.  Although 16 percent of the voters polled by one organization said that they were more inclined to vote for Mondale because of Ferraro, 26 percent said they were less likely to.  It doesn’t take a mathematical whiz to grasp that 26 is more than 16.  These statistics also assume that the 55 percent who said that Ferraro’s sex was not a factor either way were being absolutely candid, which is doubtful.  Many men and women who are subconsciously uncomfortable with the idea of a woman president are understandably reluctant to admit it, to themselves perhaps as much as to others.