Review of: After the Apocalypse: America’s Role in the World Transformed, by Andrew Bacevich

I often suffer pangs of guilt when a volume received through an early reviewer program languishes on the shelf unread for an extended period. Such was the case with the “Advanced Reader’s Edition” of After the Apocalypse: America’s Role in the World Transformed, by Andrew Bacevich, that arrived in August 2021 and sat forsaken for an entire year until it finally fell off the top of my TBR (To-Be-Read) list and onto my lap. While hardly deliberate, my delay was no doubt neglectful. But sometimes neglect can foster unexpected opportunities for evaluation. More on that later.

First, a little about Andrew Bacevich. A West Point graduate and platoon leader in Vietnam in 1970-71, he went on to an army career that spanned twenty-three years, including the Gulf War, retiring with the rank of Colonel. (It is said his early retirement was due to being passed over for promotion after taking responsibility for an accidental explosion at a camp he commanded in Kuwait.) He later became an academic, Professor Emeritus of International Relations and History at Boston University, and one-time director of its Center for International Relations (1998-2005). He is now president and co-founder of the bipartisan think tank the Quincy Institute for Responsible Statecraft. Deeply influenced by the theologian and ethicist Reinhold Niebuhr, Bacevich was once tagged as a conservative Catholic historian, but he defies simple categorization, most often serving as an unlikely voice in the wilderness decrying America’s “endless wars.” He has been a vocal, longtime critic of George W. Bush’s doctrine of preventive war, most prominently manifested in the Iraq conflict, which he has rightly termed a “catastrophic failure.” He has also denounced the conceit of “American Exceptionalism,” and chillingly notes that the reliance on an all-volunteer military force translates into the ongoing, almost anonymous sacrifice of our men and women for a nation that largely has no skin in the game. His own son, a young army lieutenant, was killed in Iraq in 2007. I have previously read three other Bacevich works. As I noted in a review of one of these, his résumé attaches to Bacevich either enormous credibility or an axe to grind, or perhaps both. Still, as a scholar and gifted writer, he tends to be well worth the read.

The “apocalypse” central to the title of this book refers to the chaos that engulfed 2020, spawned by the sum total of the “toxic and divisive” Trump presidency, the increasing death toll of the pandemic, an economy in free fall, mass demonstrations by Black Lives Matter proponents seeking long-denied social justice, and rapidly spreading wildfires that dramatically underscored the looming catastrophe of global climate change. [p. 1-3] Bacevich takes this armload of calamities as a flashing red signal that the country is not only headed in the wrong direction, but likely off a kind of cliff if we do not immediately take stock and change course. He draws odd parallels with the 1940 collapse of the French army under the Nazi onslaught, which—echoing French historian Marc Bloch—he attributes to “utter incompetence” and “a failure of leadership” at the very top. [p. xiv] This then serves as a head-scratching segue into a long-winded polemic on national security and foreign policy that recycles familiar Bacevich themes but offers little in the way of fresh analysis. This trajectory strikes me as especially incongruous given that the litany of woes besetting the nation in his opening narrative has—rarely indeed for the United States—almost nothing to do with the military or foreign affairs.

If ever history were to manufacture an example of a failure of leadership, of course, it would be hard-pressed to come up with a better model than Donald Trump, who drowned out the noise of a series of mounting crises with a deafening roar of self-serving, hateful rhetoric directed at enemies real and imaginary, deliberately ignoring the threat of both coronavirus and climate change while stoking racial tensions. Bacevich gives him his due, noting that his “ascent to the White House exposed gaping flaws in the American political system, his manifest contempt for the Constitution and the rule of law placing in jeopardy our democratic traditions.” [p. 2] But while he hardly masks his contempt for Trump, Bacevich makes plain that there’s plenty of blame to go around for political elites in both parties, and he takes no prisoners, landing a series of blows on George W. Bush, Barack Obama, Hillary Clinton, Joe Biden, and a host of other members of the Washington establishment whom he holds accountable for fostering and maintaining the global post-Cold War “American Empire” responsible for the “endless wars” he has long condemned. He credits Trump with urging a retreat from alliances and engagements, but faults the selfish motives of an “America First” predicated on isolationism. Bacevich instead envisions a more positive role for the United States in the international arena—one with its sword permanently sheathed.

All this is heady stuff, and regardless of their politics many readers will find themselves nodding along as Bacevich makes his case, outlining the many wrongheaded policy endeavors championed by Republicans and Democrats alike for a wobbly superpower clinging to an outdated and increasingly irrelevant sense of national identity that fails to align with the global realities of the twenty-first century. But then, as Bacevich looks to the future for alternatives, as he seeks to map out on paper the next new world order, he stumbles, and stumbles badly, something only truly evident in retrospect, when his argument is viewed through the prism of the events that followed the release of After the Apocalypse in June 2021.

Bacevich has little to add here to his longstanding condemnation of the U.S. occupation of Afghanistan, which after two long decades of failed attempts at nation-building came to an end with our messy withdrawal in August 2021, shortly after this book’s publication. President Biden was pilloried for the chaotic retreat, but while his administration could rightly be held to account for a failure to prepare for the worst, the elephant in that room in the Kabul airport where the ISIS-K suicide bomber blew himself up was certainly former president Trump, who brokered the deal to return Afghanistan to Taliban control. Biden, who plummeted in the polls due to outcomes he could do little to control, was disparaged much the same way Obama once was when he was held to blame for the subsequent turmoil in Iraq after effecting the withdrawal of U.S. forces agreed to by his predecessor, G.W. Bush. Once again, history rhymes. But the more salient point for those of us who share Bacevich’s anti-imperialism is that getting out is ever more difficult than going in.

But Bacevich has a great deal to say in After the Apocalypse about NATO, an alliance rooted in a past-tense Cold War stand-off that he pronounces counterproductive and obsolete. Bacevich disputes the long-held mythology of the so-called “West,” an artificial “sentiment” that has the United States and European nations bound together with common values of liberty, human rights, and democracy. Like Trump—who likely would have acted upon this had he been reelected—Bacevich calls for an end to US involvement with NATO. The United States and Europe have embarked on “divergent paths,” he argues, and that is as it should be. The Cold War is over. Relations with Russia and China are frosty, but entanglement in an alliance like NATO only fosters acrimony and fails to appropriately adapt our nation to the realities of the new millennium.

It is an interesting if academic argument that was abruptly crushed under the weight of the treads of Russian tanks in the premeditated invasion of Ukraine on February 24, 2022. If some denied that Putin’s 2014 annexation of Crimea echoed Hitler’s 1938 Austrian Anschluss, there was no mistaking the similarity of the unprovoked attacks on Kyiv and sister cities to the Nazi war machine’s march on Poland in 1939. And yes, when Biden and French President Emmanuel Macron stood together to unite that so-called West against Russian belligerence, the memory of France’s 1940 defeat was hardly out of mind. All of a sudden, NATO became less a theoretical construct and somewhat more of a safe haven against brutal militarism, wanton aggression, and unapologetic war crimes livestreamed on twenty-first-century social media: streets littered with the bodies of civilians, many of them children. All of a sudden, NATO is pretty goddamned relevant.

In all this, you could rightly point to the wrong turns made after the dissolution of the USSR, to the failure of the West to allocate appropriate economic support for the heirs of the former Soviet Union, to the way a pattern of NATO expansion both isolated and antagonized Russia. But there remains no legitimate defense for Putin’s attempt to invade, besiege, and absorb a weaker neighbor—or at least a neighbor he perceived to be weaker, a misstep that could lead to his own undoing. Either way, the institution we call NATO turned out to be something to celebrate rather than deprecate. The fact that it is working exactly the way it was designed to work could turn out to be the real road map to the new world order that emerges in the aftermath of this crisis. We can only imagine the horrific alternatives had Trump won re-election: the U.S. out of NATO, Europe divided, Ukraine overrun and annexed, and perhaps even Putin feted at a White House dinner. So far, without firing a shot, NATO has not only saved Ukraine; arguably, it has saved the world as we know it, a world that extends well beyond whatever we might want to consider the “West.”

As much as I respect Bacevich and admire his scholarship, his informed appraisal of our current foreign policy realities has turned out to be entirely incorrect. Yes, the United States should rein in the American Empire. Yes, we should turn away from imperialist tendencies. Yes, we should focus our defense budget solely on defense, not aggression, resisting the urge to try to remake the world in our own image for either altruism or advantage. But at the same time, we must be mindful that, as other empires have learned, retreat can create vacuums, and we must be ever vigilant about what kinds of powers may fill them. Because we can grow and evolve into a better nation, a better people, but that evolution may not prove contagious to our adversaries. Because getting out remains ever more difficult than going in.

Finally, a word about the use of the term “apocalypse,” a characterization that is bandied about a bit too frequently these days. 2020 was a pretty bad year, indeed, but it was hardly apocalyptic. Not even close. Despite the twin horrors of Trump and the pandemic, we have had other years that were far worse. Think 1814, when the British burned Washington and sent the president fleeing for his life. And 1862, with tens of thousands already lying dead on Civil War battlefields as the Union army suffered a series of reverses. And 1942, the Great Depression barely behind us, with Germany and Japan lined up against us. And 1968, marked by riots and assassinations, when it truly seemed that the nation was unraveling from within. Going forward, climate change may certainly breed apocalypse. So might a cornered Putin, equipped with an arsenal of nuclear weapons and diminishing options as Russian forces in the field teeter on collapse. But 2020 is already in the rear-view mirror. It will no doubt leave a mark upon us, but as we move on, it spins ever faster into our past. At the same time, predicting the future, even when armed with the best data, is fraught with unanticipated obstacles, and grand strategies almost always lead to failure. It remains our duty to study our history while we engage with our present. Apocalyptic or not, it’s all we’ve got …

I have reviewed other Bacevich books here:

Review of: Breach of Trust: How Americans Failed Their Soldiers and Their Country, by Andrew J. Bacevich

Review of: America’s War for the Greater Middle East: A Military History, by Andrew J. Bacevich


Review of: 1957: The Year That Launched the American Future, by Eric Burns

On October 4, 1957, the Soviet Union sent geopolitical shock waves across the planet with the launch of Sputnik 1, the first artificial Earth satellite. Sputnik was only twenty-three inches in diameter, transmitted radio signals for a mere twenty-one days, then burned up on reentry just three months after first achieving orbit, but it changed everything. Not only were the dynamics of the Cold War permanently altered by what came to be dubbed the “Space Race,” but the success of Sputnik ushered in a dramatic new era for developments in science and technology. I was not quite six months old.

America went on to win that race to the moon, but despite its fearsome specter as a diabolical power bent on world domination, the USSR turned out to be a kind of vast Potemkin Village that almost noiselessly went out of business at the close of 1991. The United States had pretty much lost interest in space travel by then, but that was just about the time that the next critical phase in the emerging digital age—widespread public access to personal computers and the internet—first wrought the enormous changes upon the landscape of American life that today might have Gen Z “zoomers” considering 1957 as something like a date out of ancient times.

And now, as this review goes to press—in yet one more recycling of Mark Twain’s bon mot “History Doesn’t Repeat Itself, but It Often Rhymes”—NASA temporarily scrubbed the much-anticipated blastoff of lunar-bound Artemis I, but a real space race is again fiercely underway, although this time the rivals include not only Russia, but China and a whole host of billionaires, at least one of whom could potentially fit a template for a “James Bond” style villain. And while all this is going on, I recently registered for Medicare.

Sixty-five years later, there’s a lot to look back on. In 1957: The Year That Launched the American Future (2020), a fascinating, fast-paced chronicle composed of articulate, thought-provoking chapter-length essays, author and journalist Eric Burns reminds us of what a pivotal year that proved to be, not only in kindling that first contest to dominate space but in multiple other social, political, and cultural arenas, much of it apparent only in retrospect.

That year, while Sputnik stoked alarms that nuclear-armed Russians would annihilate the United States with bombs dropped from outer space, tabloid journalism reached what were then new levels of the outrageous in exploiting “The Mad Bomber of New York,” who turned out to be a pathetic little fellow whose series of explosives claimed not a single fatality. In another example of history’s unintended consequences, a congressional committee investigating illegal labor activities helped facilitate Jimmy Hoffa’s takeover of the Teamsters. A cloak of mystery was partially lifted from organized crime with a very public police raid at Apalachin that rounded up Mafia bosses by the score. The iconic ’57 Chevy ruled the road and cruised on newly constructed interstate highways that would revolutionize travel as well as wreak havoc on cityscapes. African Americans remained second-class citizens, but struggles for equality ignited a series of flashpoints. In September 1957, President Eisenhower federalized the Arkansas National Guard and sent Army troops to Little Rock to enforce desegregation. That same month, Congress passed the Civil Rights Act of 1957, watered-down yet still landmark legislation that paved the way for more substantial action ahead. Published that year were Jack Kerouac’s On the Road and Nevil Shute’s On the Beach. Michael Landon starred in I Was a Teenage Werewolf. Little Richard, who claimed to see Sputnik while performing in concert and took it as a message from God, abruptly walked off stage and abandoned rock music to preach the word of the Lord. But the nation’s number one hit was Elvis Presley’s All Shook Up; rock ’n’ roll was here to stay.

Burns’ commentary on all this and more is engaging and generally a delight to read, but 1957 is by no means a comprehensive history of that year. In fact, it is a stretch to term this book a history at all except in the sense that the events it describes occurred in the past. It is rather a subjective collection of loosely linked commentaries that spotlight specific events and emerging trends the author identifies as formative for the nation we would become in the decades that followed. As such, the book succeeds due to Burns’ keen sense of how both key episodes and more subtle cultural waves influenced a country in transition from the conventional, consensus-driven postwar years to the radicalized, tumultuous times that lay just ahead.

His insight is most apparent in his cogent analysis of how Civil Rights advanced not only through lunch-counter sit-ins and a reaction marked by violent repression, but through cultural shifts among white Americans—and how rock ’n’ roll had at least some role in this evolution of outlooks. At the same time, his conservative roots are exposed in his treatment of On the Road and the rise of the “Beat generation”; Burns genuinely seems as baffled by their emergence as he is amazed that anyone could praise Kerouac’s literary talents. But, to his credit, he recognizes the impact the novel had upon a national audience that could no longer confidently boast of a certainty in its destiny. And it is Burns’ talent with a pen that captivates a markedly different audience, some sixty-five years later.

In the end, the author leaves us yearning for more. After all, other than references that border on the parenthetical to Richard Nixon, Robert F. Kennedy, and Dag Hammarskjöld, there is almost no discussion of national politics or international relations, essential elements in any study of a nation at what the author insists is a critical juncture. Even more problematic, very conspicuous in its absence is the missing chapter that should have been devoted to television. In 1950, 3.9 million TV sets were in fewer than ten percent of American homes. By 1957, that number had increased roughly tenfold to 38.9 million TVs in the homes of nearly eighty percent of the population! That year, I Love Lucy aired its final half-hour episode, but in addition to network news, families were glued to their black-and-white consoles watching Gunsmoke, Alfred Hitchcock, Lassie, You Bet Your Life, and Red Skelton. For the World War II generation, technology that brought motion pictures into their living rooms was something like miraculous. Nothing was more central to the life of the average American in 1957 than television, but Burns inexplicably ignores it.

Other than Sputnik, which clearly marked a turning point for science and exploration, it is a matter of some debate whether 1957 should be singled out as the start of a new era. One could perhaps argue instead for the election of John F. Kennedy in 1960 or, with even greater conviction, for the date of his assassination in 1963, as a true crossroads for the past and future United States. Still, if for no other reason than the conceit that this was my birth year, I am willing to embrace Burns’ thesis that 1957 represented a collective critical moment for us all. Either way, his book delivers an impressive tour of a time that seems increasingly distant with the passing of each and every day.


Review of: Bogart, by A.M. Sperber & Eric Lax

Early in 2022, I saw Casablanca on the big screen for the first time, on the occasion of the 80th anniversary of its premiere. Although over the years I have watched it more than two dozen times, this was a stunning, even mesmerizing experience for me, not least because I consider Casablanca the finest film of Old Hollywood—this over the objections of some of my film-geek friends who would lobby for Citizen Kane in its stead. Even so, most would concur with me that its star, Humphrey Bogart, was indeed the greatest actor of that era.

Attendance was sparse, diminished by a resurgence of COVID, but I sat transfixed in that nearly empty theater as Bogie’s distraught, drunken Rick Blaine famously raged: “Of all the gin joints in all the towns in all the world, she walks into mine!” He is, of course, lamenting his earlier unexpected encounter with old flame Ilsa Lund, splendidly portrayed with a sadness indelibly etched upon her beautiful countenance by Ingrid Bergman, who with Bogart led the credits of a magnificent ensemble cast that also included Paul Henreid, Claude Rains, Conrad Veidt, Sydney Greenstreet, and Peter Lorre. But Bogie remains the central object of that universe, the plot and the players in orbit about him. There’s no doubt that without Bogart, there could never have been a Casablanca as we know it. Such a movie might have been made, but it could hardly have achieved greatness of this magnitude.

Bogie never actually uttered the signature line “Play it again, Sam,” so closely identified with the production (and later whimsically poached by Woody Allen for the title of his iconic 1972 comedy peppered with clips from Casablanca). And although the film won Academy Awards for Best Picture and Best Director, as well as in almost every other major category, Bogart was nominated but missed out on the Oscar, which instead went to Paul Lukas—does anyone still remember Paul Lukas?—for his role in Watch on the Rhine. This turns out to be a familiar story for Bogart, who struggled with a lifelong frustration at typecasting, miscasting, studio manipulation, lousy roles, inadequate compensation, missed opportunities, and repeated snubs—public recognition of his talent and star-quality came only late in life and even still frequently eluded him, as on that Oscar night. He didn’t really expect to win, but we can yet only wonder at what Bogart must have been thinking . . . He was already forty-four years old on that disappointing evening when the Academy passed him over. There was no way he could have known that most of his greatest performances lay ahead; that after multiple failed marriages (one still unraveling that very night) a young starlet he had only just met would come to be the love of his life and mother of his children; and that he would at last achieve not only the rare brand of stardom reserved for just a tiny slice of the top tier in his profession, but go on to become a legend in his own lifetime and well beyond it: the epitome of the cool, tough, cynical guy who wears a thin veneer of apathy over an incorruptible moral center, women swooning over him as he stares down villains, an unlikely hero that every real man would seek to emulate.

My appreciation of Casablanca and its star in this grand cinema setting was enhanced by the fact that I was at the time reading Bogart (1997), by A.M. Sperber & Eric Lax, which is certainly the definitive biography. I was also engaged in a self-appointed effort to watch as many key Bogie films in roughly chronological order as I could while reading the bio, which eventually turned out to be a total of twenty movies, from his first big break in The Petrified Forest (1936) to The Harder They Fall (1956), his final role prior to his tragic, untimely death at fifty-seven from esophageal cancer.

Bogie’s story is told brilliantly in this unusual collaboration by two authors who had never actually met. Ann Sperber, who wrote a celebrated biography of journalist Edward R. Murrow, spent seven years researching Bogart’s life and conducted nearly two hundred interviews with those who knew him most intimately before her sudden death in 1994. Biographer Eric Lax stepped in and shaped her draft manuscript into a coherent finished product that reads seamlessly, as if in a single voice. I frequently read biographies of American presidents not only to study the figure profiled, but because the very best ones serve double duty as chronicles of United States history, with the respective president as the focal point. I looked to the Bogart book for something similar, in this case a study of Old Hollywood with Bogie in the starring role. I was not to be disappointed.

Humphrey DeForest Bogart was born on Christmas Day 1899 in New York City to wealth and privilege, with a father who was a cardiopulmonary surgeon and a mother who was a commercial illustrator. Both parents were distant and unaffectionate. They had an apartment on the Upper West Side and a vast estate on Canandaigua Lake in upstate New York, where Bogie began his lifelong love affair with boating. Indifferent to higher education, he eventually flunked out of boarding school and joined the navy. There was nothing especially noteworthy about his early life.

His acting career began almost accidentally, and he spent several years on the stage before making his first full-length feature in 1930, Up the River, with his drinking buddy Spencer Tracy, who called him “Bogie.” He was already thirty years old. What followed were largely lackluster roles on both coasts, alternating between Broadway theaters and Hollywood studios. He was frequently broke, drank heavily, and his second marriage was crumbling. Then he won rave reviews as escaped murderer Duke Mantee in The Petrified Forest, playing opposite Leslie Howard on the stage. The studio bought the rights but, characteristically for Bogie, did not want him to reprise his role, looking instead for an established actor, with Edward G. Robinson at the top of the list. Then Howard, who held production rights, stepped in to demand that Bogart get the part. The 1936 film adaptation of the play, which also featured a young Bette Davis, channeled Bogart’s dark and chillingly realistic portrayal of a psychopathic killer—in an era when gangsters like Dillinger and Pretty Boy Floyd dominated the headlines—and made Bogie a star.

But again he faced a series of let-downs. This was the era of the studio system, with actors used and abused by big shots like Jack Warner, who locked Bogart into a low-paid contract that tightly controlled his professional life, casting him repeatedly in virtually interchangeable gangster roles in a string of B-movies. It wasn’t until 1941, when he played Sam Spade in The Maltese Falcon—quintessential film noir as well as John Huston’s directorial debut—that Bogie joined the ranks of undisputed A-list stars and began the process of taking revenge on the studio system by commanding greater compensation and demanding greater control of his screen destiny. But in those days, despite his celebrity, that remained an uphill battle.

I began watching his films while reading the bio as a lark, but it turned out to be an essential assignment: you can’t read about Bogie without watching him. Many of the twenty that I screened I had seen before, some multiple times, but others were new to me. I was raised by my grandparents in the 1960s with a little help from a console TV in the living room and all of seven channels delivered via rooftop antenna. When cartoons, soaps, and prime-time westerns and sitcoms weren’t broadcasting, the remaining airtime was devoted to movies. All kinds of movies, from the dreadful to the superlative and everything in-between, often on repeat. Much of it was classic Hollywood and Bogart made the rounds. One of my grandfather’s favorite flicks was The Treasure of the Sierra Madre, and I can recall as a boy watching it with him multiple times. In general, he was a lousy parent, but I am grateful for that gift; it remains among my top Bogie films. We tend to most often think of Bogart as Rick Blaine or Philip Marlowe, but it is as Fred C. Dobbs in The Treasure of the Sierra Madre and Charlie Allnutt in The African Queen and Captain Queeg in The Caine Mutiny that the full range of his talent is revealed.

It was hardly his finest role or his finest film, but it was while starring as Harry Morgan in To Have and Have Not (1944) that Bogie met and fell for his co-star, the gorgeous, statuesque, nineteen-year-old Lauren Bacall—twenty-five years younger than him—spawning one of Hollywood’s greatest on-screen, off-screen romances. They would be soulmates for the remainder of his life, and it was she who brought out the very best of him. Despite his tough-guy screen persona, the real-life Bogie tended to be a brooding intellectual who played chess, was well-read, and had a deeply analytical mind. An expert sailor, he preferred boating on the open sea to carousing in bars, although he managed to do plenty of both. During crackdowns on alleged communist influence in Hollywood, Bogart and Bacall together took controversial and sometimes courageous stands against emerging blacklists and the House Un-American Activities Committee (HUAC). But he also had his flaws. He could be cheap. He could be a mean drunk. He sometimes wore a chip on his shoulder carved out of years of frustration at what was after all a very slow rise to the top of his profession. But warts and all, far more of his peers loved him than not.

Bogart is a massive tome, and the first section is rather slow-going because Bogie’s early life was just so unremarkable. But it holds the reader’s interest because it is extremely well-written, and it goes on to succeed masterfully in spotlighting Bogart’s life against the rich fabric that forms the backdrop of that distant era of Old Hollywood before the curtains fell for all time. If you are curious about either, I highly recommend this book. If you are too busy for that, at the very least carve out some hours of screen time and watch Bogie’s films. You will not regret the time spent. Although his name never gets dropped in the lyrics by Ray Davies for the familiar Kinks tune, if there were indeed Celluloid Heroes, the greatest among them was certainly Humphrey Bogart.


NOTE: These are Bogart films I screened while reading this book:

The Petrified Forest (1936)

Dead End (1937)

High Sierra (1941)

The Maltese Falcon (1941)

Across the Pacific (1942)

Casablanca (1942)

Passage to Marseille (1944)

To Have and Have Not (1944)

The Big Sleep (1946)

Dark Passage (1947)

Dead Reckoning (1947)

The Treasure of the Sierra Madre (1948)

Key Largo (1948)

In a Lonely Place (1950)

The African Queen (1951)

Beat the Devil (1953)

The Caine Mutiny (1954)

Sabrina (1954)

The Desperate Hours (1955)

The Harder They Fall (1956)


Review of: A Self-Made Man: The Political Life of Abraham Lincoln Vol. I, 1809–1849, by Sidney Blumenthal

Historians consistently rank him at the top, tied with Washington for first place or simply declared America’s greatest president. His tenure was almost precisely synchronous with the nation’s most critical existential threat: his very election sparked secession, first shots fired at Sumter a month after his inauguration, the cannon stilled at Appomattox a week before his murder. There were still armies in the field, but he was gone, replaced by one of the most sinister men ever to take the oath of office, leaving generations of his countrymen to wonder what might have transpired with all the nation’s painful unfinished business had he survived, from the trampled hopes for equality for African Americans to the promise of a truly “New South” that never emerged. A full century ago, decades after his death, he was reimagined as an enormous, seated marble man with the soulful gaze of fixed purpose, the central icon in his monument that provokes tears for so many visitors who stand in awe before him. When people think of Abraham Lincoln, that’s the image that usually springs to mind.

The seated figure rises to a height of nineteen feet; somebody calculated that if it stood up it would be some twenty-eight feet tall. The Lincoln that once walked the earth was not nearly that gargantuan, but he was nevertheless a giant in his time: physically, intellectually—and far too frequently overlooked—politically! He sometimes defies characterization because he was such a character, in so very many ways.

An autodidact gifted with a brilliant analytical mind, he was also a man of great integrity, loyal to a firm moral center that evolved as it was polished by new experiences and touched by unfamiliar ideas. A savvy politician, he understood how the world worked. He had unshakeable convictions, but he was tolerant of competing views. He had a pronounced sense of empathy for others, even and most especially his enemies. In company, he was a raconteur with a great sense of humor given to anecdotes often laced with self-deprecating wit. (Lincoln, thought to be homely, when accused in debate of being two-faced, self-mockingly replied: “I leave it to my audience. If I had another face, do you think I’d wear this one?”) But despite his many admirable qualities, he was hardly flawless. He suffered from self-doubt, struggled with depression, stumbled through missteps, burned with ambition, and harbored a mean streak that, though generally suppressed, could surface. More than anything else, he had an outsize personality.

And Lincoln likewise left an outsize record of his life and times! So why has he generally posed such a challenge for biographers? Remarkably, some 15,000 books have been written about him—second, it is said, only to Jesus Christ—and yet in this vast literature the essence of Lincoln again and again somehow seems out of reach to his chroniclers. We know what he did and how he did it all too well, but portraying what the living Lincoln must have been like has remained frustratingly elusive in all too many narratives. For instance, David Herbert Donald’s highly acclaimed bio—considered by many the best single-volume treatment of his life—is indeed impressive scholarship, yet leaves us with a Lincoln who is curiously dull and lifeless. Known for his uproarious banter, the guy who joked about being ugly for political advantage is glaringly absent in most works outside of Gore Vidal’s Lincoln, which superbly captures him but remains, alas, a novel, not a history.

All that changed with A Self-Made Man: The Political Life of Abraham Lincoln Vol. I, 1809–1849, by Sidney Blumenthal (2016), an epic, ambitious, magnificent contribution to the historiography that demonstrates not only that despite the thousands of pages written about him there still remains much to say about the man and his times, but even more significantly that it is possible to brilliantly recreate for readers what it must have been like to engage with the flesh and blood Lincoln. This is the first in a projected five-volume study (two subsequent volumes have been published to date) that—as the subtitle underscores—emphasizes the “political life” of Lincoln, another welcome contribution to a rapidly expanding genre focused upon politics and power, as showcased in such works as Jon Meacham’s Thomas Jefferson: The Art of Power, Robert Dallek’s Franklin D. Roosevelt: A Political Life, and George Washington: The Political Rise of America’s Founding Father, by David O. Stewart.

At first glance, this tactic might strike some as surprising, since prior to his election as president in 1860 Lincoln could boast of little in the realm of public office beyond service in the Illinois state legislature and a single term in the US House of Representatives in the late 1840s. But, as Blumenthal’s deeply researched and well-written account reveals, politics defined Lincoln to his very core, inextricably manifested in his life and character from his youth onward, something too often disregarded by biographers of his early days. It turns out that Lincoln was every bit a political animal, and there is a trace of that in nearly every job he ever took, every personal relationship he ever formed, and every goal he ever chased.

This approach triggers a surprising epiphany for the student of Lincoln. It is as if an entirely new dimension of the man has been exposed for the first time that lends new meaning to words and actions previously treated superficially or—worse—misunderstood by other biographers. Early on, Blumenthal argues that Donald and others have frequently been misled by Lincoln’s politically crafted utterances that cast him as marked by passivity, too often taking him at his word when a careful eye on the circumstances demonstrates the exact opposite. In contrast, Lincoln, ever maneuvering, if quietly, could hardly be branded as passive. [p. 9] Given this perspective, the life and times of young Abe are transformed into something far richer and more colorful than the usual accounts of his law practice and domestic pursuits. In another context, I once snarkily exclaimed “God save us from The Prairie Years” because I found Lincoln’s formative period—and not just Sandburg’s version of it—so uninteresting and unrelated to his later rise. Blumenthal has proved me wrong, and that sentiment deeply misplaced.

But Blumenthal not only succeeds in fleshing out a far more nuanced portrait of Lincoln—an impressive accomplishment on its own—but in the process boldly sets out to do nothing less than scrupulously detail the political history of the United States in the antebellum years, from the Jackson-Calhoun nullification crisis onward. Ambitious is hardly an adequate descriptor for the elaborate narrative that results, a product of both prodigious research and a very talented pen. Scores of pages—indeed whole chapters—occur with literally no mention of Lincoln at all, a striking technique that is surprisingly successful; while Lincoln may appear conspicuous in his absence, he is nevertheless present, like the reader a studious observer of these tumultuous times even when he is not directly engaged, making an appearance only when the appropriate moment beckons. As such, A Self-Made Man is every bit as much a book of history as it is biography, a key element of the author’s unstated thesis: that it is impossible to truly get to know Lincoln—especially the political Lincoln—except in the context and complexity of his times, a critical emphasis not afforded in other studies.

And there is much to chronicle in these times. Some of this material is well known, even if until recently subject to faulty analysis. The conventional view of the widespread division that characterized the antebellum period centered on a sometimes-paranoid south on the defensive, jealous of its privileges, in fear of a north encroaching upon its rights. But in keeping with the latest historiography, Blumenthal deftly highlights how, in contrast, the slave south—which already wielded a disproportionate share of national political power due to the Constitution’s three-fifths clause that inflated its representation—not only stifled debate on slavery but aggressively lobbied for its expansion. And just as a distinctly southern political ideology evolved its notion of the peculiar institution from the “wolf by the ear” necessary evil of Jefferson’s time to a vaunted hallmark of civilization that boasted benefit to master and servant, so too did it come to view the threat of separation less with dread than with anticipation. The roots of all that an older Lincoln would witness severing the ancient “bonds of affection” of the then no longer united states were planted in these, his early years.

Other material is less familiar. Who knew how integral to Illinois politics—for a time—were the cunning Joseph Smith and his Mormon sect? Or that Smith’s path was once entangled with the budding career of Stephen A. Douglas? Meanwhile, the author sheds new light on the long rivalry between Lincoln and Douglas, which had deep roots going back to the 1830s, decades before their celebrated clash on the national stage brought Lincoln to a prominence that finally eclipsed Douglas’s star.

Blumenthal’s insight also adeptly connects the present to the past, affording a greater relevance for today’s reader. He suggests that the causes of the financial crisis of 2008 were not all that dissimilar to those that drove the Panic of 1837, but rather than mortgage-backed securities and a housing bubble, it was the monetization of human beings as slave property that leveraged enormous fortunes that vanished overnight when an oversupply of cotton sent market prices plummeting, which triggered British banks to call in loans on American debtors—a cotton bubble that burst spectacularly. [p. 158-59] This point can hardly be overstated, since slavery was not only integral to the south’s economy, but by the eve of secession human property was to represent the largest single form of wealth in the nation, exceeding the combined value of all American railroads, banks, and factories. A cruel system that assigned values to men, women, and children like cattle had deep ramifications not only for masters who acted as “breeders” in the Chesapeake and markets in the deep south, but also for insurance companies in Hartford, textile mills in Lowell, and banks in London.

Although Blumenthal does not himself make this point, I could detect eerie if imperfect parallels to the elections of 2016 and 1844, with Lincoln seething as the perfect somehow became the enemy of the good. In that contest, Whig Henry Clay was up against Democrat James K. Polk. Both were slaveowners, but Clay opposed the expansion of slavery while Polk championed it. Antislavery purists in New York rejected Clay for the tiny Liberty Party, which by a slender margin tipped the election to Polk, who then boosted the slave power with Texas annexation and served as principal author of the Mexican War that added vast territories to the nation, setting forces in motion that later spawned secession and Civil War. Lincoln was often prescient, but of course he could not know all that was to follow when, a year after Clay’s defeat, he bitterly denounced the “moral absolutism” that led to the “unintended tragic consequences” of Polk’s elevation to the White House. [p. 303] To my mind, there was an echo of this in the 2016 disaster that saw Donald Trump prevail, a victory at least partially driven by those unwilling to support Hillary Clinton who—despite the stakes—threw away their votes on Jill Stein and Gary Johnson.

No review could properly summarize the wealth of the material contained here, nor overstate the quality of the presentation, which also suggests much promise for the volumes that follow. I must admit that at the outset I was reluctant to read yet another book about Lincoln, but A Self-Made Man was recommended to me by no less than historian Rick Perlstein (author of Nixonland), and like Perlstein’s, Blumenthal’s style is distinguished by animated prose bundled with a kind of uncontained energy that frequently delivers paragraphs given to an almost breathless exhale of ideas and people and events that expertly locates the reader at the very center of concepts and consequences. The result is something exceedingly rare for books of history or biography: a page-turner! Whether you are new to studies of Lincoln or a longtime devotee, consider this book required reading.


A review of one of Rick Perlstein’s books is here: Review of: Nixonland: The Rise of a President and the Fracturing of America, by Rick Perlstein

I reviewed the subsequent two volumes in Blumenthal’s Lincoln series here: Review of: Wrestling With His Angel: The Political Life of Abraham Lincoln Vol. II, 1849-1856, and All the Powers of Earth: The Political Life of Abraham Lincoln Vol. III, 1856-1860, by Sidney Blumenthal


Review of: Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine, by Jim Downs

As the COVID-19 pandemic swept the globe in 2020, it left in its wake the near-paralysis of many hospital systems, unprepared and unequipped for the waves of illness and death that suddenly overwhelmed their capacity for treatment, which was, after all, at best palliative, since for this deadly new virus there was neither a cure nor a clear route to prevention. Overnight, epidemiologists—scrambling for answers or even just clues—became the most critically significant members of the public health community, even if their informed voices were often shouted down by the shriller ones of media pundits and political hacks.

Meanwhile, data collection began in earnest and the number of data dashboards swelled. In the analytical process, the first stop was assessing the quality of the data and the disparities in how it was collected. Was it true, as some suggested, that a disproportionate number of African Americans were dying from COVID? At first, there was no way to know, since some states were not collecting data broken down by that specific demographic. Data collection eventually became more standardized, more precise, and more reliable, serving as a key ingredient in combating the spread of this highly contagious virus, as well as one of the elements that guided the development of vaccines. Even so, dubious data and questionable studies too often took center stage, both at political rallies and in a media circus that echoed a growing polarization that had one side denouncing masks, resisting vaccination, and touting sideshow magic bullets like ivermectin. But talking heads and captive audiences aside, masks reduce infection, vaccines are effective, and dosing with ivermectin is a scam. How do we know that? Data. Mostly due to data. Certainly, other key parts of the mix include scientists, medical professionals, case studies, and peer-reviewed papers, but data—first collected and then analyzed—is the gold standard, not only for COVID but for all disease treatment and prevention.

But it wasn’t always that way.

In the beginning, there was no such thing as epidemiology. Explanations of disease causes and treatments were anecdotal, mystical, or speculative. Much of the progress in science and medicine that was the legacy of the classical world had long been lost to the west. The dawn of modern epidemiology rose above a horizon constructed of data painstakingly collected, compiled, and subsequently analyzed. In fact, certain aspects of the origins of epidemiology ran concurrently with the evolution of statistical analysis. In the early days, as the reader comes to learn in this brilliant and groundbreaking 2021 work by historian Jim Downs, Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine, the bulk of the initial data was derived from unlikely and unwilling participants who existed at the very margins: the enslaved, the imprisoned, the war-wounded, and the destitute condemned to the squalor of public hospitals. Their identities are mostly forgotten, or were never recorded in the first place, yet collectively the data harvested from them was to provide the skeletal framework for the foundation of modern medicine.

In a remarkable achievement that could hardly be more relevant today, the author cleverly locates Maladies of Empire at the intersection of history and medicine, where data collection from unexpected and all too frequently wretched subjects comes to form the very basis of epidemiology itself. It is these early stories that send shudders to a modern audience. Nearly everyone is familiar with the wrenching 1787 diagram of the lower deck of the slave ship Brookes, where more than four hundred fifty enslaved human beings were packed like sardines for a months-long voyage, which became an emblem for the British antislavery movement. But, as Downs points out, few are aware that the sketch can be traced to the work of British naval surgeon Dr. Thomas Trotter, one of the first to recognize that poor ventilation in crowded conditions results in a lack of oxygen that breeds disease and death. His observations also led to a better understanding of how to prevent scurvy, a frequent cause of higher mortality rates among the seaborne citrus-deprived. Trotter himself was appalled by the conditions he encountered on the Brookes, and testified to this before the House of Commons. But that was hardly the case for many of his peers, and certainly not for the owners of slave ships, who looked past the moral dilemmas of a Trotter while remaining exceedingly grateful for his insights; after all, the goal was to keep more of their human cargo alive in order to turn greater profits. Dead slaves lack market value.

A little more than three decades prior to Trotter’s testimony, the critical need for ventilation was documented by another physician in the wake of the confinement of British soldiers in the infamous “Black Hole of Calcutta” during the revolution in Bengal, which resulted in the death by suffocation of the majority of the captives. Downs makes the point that one of the unintended consequences of colonialism was that it vastly extended the theater of observation for early actors in the medical arena to a virtually global stage hosting colonialism’s byproducts: war, subjugated peoples, the slave trade, military hospitals, and prisons. But it turns out that the starring roles belong less to the doctors and nurses who receive top billing in the history books than to the mostly uncredited bit players removed from the spotlight: the largely helpless and disadvantaged patients whose symptoms and outcomes were observed and cataloged, whose anonymous suffering translated into critical data that collectively advanced the emerging science of epidemiology.

Traditionally, history texts rarely showcased notable women, but one prominent exception was Florence Nightingale, frequently extolled for her role as a nurse during the Crimean War. Yet as underscored in Maladies of Empire, Nightingale’s real if often overlooked legacy was as a kind of disease statistician through her painstaking data collection and analysis—the very basis for epidemiology that was generally credited to white men rather than to “women working in makeshift hospitals.” [p. 111] It was the poor outcomes for patients typically subjected to deplorable conditions in these makeshift military hospitals—which Nightingale assiduously observed and recorded—that drew attention to similarly appalling environments in civilian hospitals in England and the United States, which led to a studied analysis that eventually established systematic evidence for the causes, spread, and treatment of disease.

The conclusions these early epidemiologists reached were not always accurate. In fact, they were frequently wrong. But Downs emphasizes that what was significant was the development of the proper analytical framework. In these days prior to the revolutionary development of germ theory, notions put forward by Nightingale and others on how to improve survival rates of the stricken were controversial and often contradictory. Was the best course quarantine, a frequent resort? Or would improving sickbed conditions, as Nightingale advocated, lead to better outcomes? With the role of germs in contagion unknown, evidence could be both inconclusive and inconsistent, and competing ideas could each be partly right. After all, regardless of how disease spread, cleaner and better-ventilated facilities might lead to lower mortality rates. Nightingale stubbornly resisted germ theory even as it was widely adopted, but after it won her grudging acceptance, she continued to promote more sanitary hospital conditions to improve survival rates. Still, epidemiologists faced difficult challenges with diseases that did not conform to familiar patterns, such as cholera, spread by a tainted water supply, and yellow fever, a mosquito-borne disease.

In the early days, as noted, European observers collected data from slave ships, yet it never occurred to them that because their human subjects were black such evidence might not be applicable to the white population. But epidemiology took a surprisingly different course in the United States, where race has long proved to be a defining element. Of the more than six hundred thousand who lost their lives during the American Civil War, about two-thirds were felled not by bullets but by disease. The United States Sanitary Commission (USSC) was established in an attempt to ameliorate these dreadful outcomes, but its achievements on one hand were undermined on the other by an obsession with race, even going so far as to send “. . . military doctors a questionnaire, ‘The Physiological Status of the Negro,’ whose questions were based on the belief that Black soldiers were innately different from white soldiers . . . The questionnaire also distinguished gradations of color among Black soldiers, asking doctors to compare how ‘pure Negroes’ differed from people of ‘mixed races’ and to describe ‘the effects of amalgamation on the vital endurance and vigor of the offspring.’” With its imprimatur of governmental authority, the USSC officially championed scientific racism, with profound and long-term social, political, and economic consequences for African Americans. [p. 134-35]

Some of these notions can be traced back to the antebellum musings of Alabama surgeon Josiah Nott—made famous after the war when he correctly connected mosquitoes to the etiology of yellow fever—who asserted that blacks and whites were members of separate species whose mixed-race offspring he deemed “hybrids” who were “physiologically inferior.” Nott believed that all three of these distinct “types” responded differently to disease. [p. 124-25] His was but one manifestation of the once widespread pseudoscience of polygenism that alleged black inferiority in order to justify first slavery and later second-class citizenship. Such ideas persisted for far too long, and although scientific racism still endures on the alt-right, it has been thoroughly discredited by actual scientists. It turns out that a larger percentage of African Americans did indeed die in the still-ongoing COVID pandemic, but this has been shown to be due to socioeconomic status and lack of access to healthcare, not genetics.

Still, although deemed inferior, enslaved blacks also proved useful when convenient. The author argues that “… enslaved children were most likely used as the primary source of [smallpox] vaccine matter in the Civil War South,” despite the danger of infection in harvesting lymph from human subjects in order to vaccinate Confederate soldiers in the field. In yet one more reminder of the moral turpitude that defined the south’s “peculiar institution,” the subjects also included infants whose resulting scar or pit, Downs points out, “. . . would last a lifetime, indelibly marking a deliberate infection of war and bondage. Few, if any, knew that the scars and pit marks actually disclosed the infant’s first form of enslaved labor, an assignment that did not make it into the ledger books or the plantation records.” [p. 141-42]

Tragically, this episode was hardly an anomaly, and unethical medical practices involving blacks did not end with Appomattox. The infamous “Tuskegee Syphilis Study” that observed but failed to offer treatment to the nearly four hundred black men recruited without informed consent ran for forty years and was not terminated until 1972! One of the chief reasons for COVID vaccine hesitancy among African Americans has been identified as a distrust of a medical community that historically has either victimized or marginalized them.

Maladies of Empire is a well-written, highly readable book suitable to a scholarly as well as popular audience, and clearly represents a magnificent contribution to the historiography. But it is hardly only for students of history. Instead, it rightly belongs on the shelf of every medical professional practicing today—especially epidemiologists!

Review of: Chemistry for Breakfast: The Amazing Science of Everyday Life, by Mai Thi Nguyen-Kim

Is your morning coffee moving? Is there a particle party going on in your kitchen? What makes for a great-tasting gourmet meal? Does artificial flavoring really make a difference? Why does mixing soap with water get your dishes clean? Why do some say that “sitting is the new smoking”? How come one beer gives you a strong buzz but your friend can drink a bottle of wine without slurring her words? When it comes to love, is the “right chemistry” just a metaphor? And would you dump your partner because he won’t use fluoridated toothpaste?

All this and much more makes for the delightful conversation packed into Chemistry for Breakfast: The Amazing Science of Everyday Life, by Mai Thi Nguyen-Kim, a fun, fascinating, and fast-moving slender volume that could very well turn you into a fan of—of all things—chemistry! This cool and quirky book is just the latest effort by the author—a real-life German chemist who hosts a YouTube channel and has delivered a TED Talk—to combat what she playfully dubs “chemism”: the notion that chemistry is dull and best left to the devices of boring, nerdy chem-geeks! One reason it works is that Nguyen-Kim is herself the antithesis of such stereotypes, coming off in both print and video as a hip, brilliant, and articulate young woman with a passion for science and for living in the moment.

I rarely pick up a science book, but when I do, I typically punch above my intellectual weight, challenging myself to reach beyond my facility with history and literature to dare to tangle with the intimidating realms of physics, biology, and the like. I often emerge somewhat bruised but with the benefit of new insights, as I did after my time with Sean Carroll’s The Particle at the End of the Universe and Bill Schopf’s Cradle of Life. So it was with a mix of eagerness and trepidation that I approached Chemistry for Breakfast.

But this proved to be a vastly different experience! Using her typical day as a backdrop—from her own body's release of stress hormones when the alarm sounds to the way postprandial glasses of wine mess with the neurotransmitters of her guests—Nguyen-Kim demonstrates the omnipresence of chemistry in our very existence, and distills its complexity into bite-size concepts that are easy to process yet never dumbed down. Apparently, there is a particle party going on in your kitchen every morning, with all kinds of atoms moving at different rates in the coffee you're sipping, the mug in your hand, and the steam rising above it. It's all about temperature and molecular bonds. In a chapter whimsically entitled "Death by Toothpaste," we find out how chemicals bond to produce sodium fluoride, the stuff of toothpaste, and why that not only makes for a potent weapon against cavities, but why the author's best buddy might dump her boyfriend—because he thinks fluoride is poison! There's much more to come—and it's still only morning at Mai's house …

As a reader, I found myself learning a lot about chemistry without studying chemistry, a remarkable achievement by the author, whose technique is so effective because it is so distinctive. Drawing on humorous anecdotes plucked from everyday existence, Mai writes with infectious wit, so the "lessons" prove entertaining without turning silly. I love to cook, so I especially welcomed her return to the kitchen in a later chapter. Alas, I found out that while I can pride myself on my culinary expertise, it all really comes down to the way ingredients react with one another in a mixing bowl and on the hot stove. Oh, and it turns out that despite the fearmongering in some quarters, most artificial flavors are no better or worse than natural ones. Yes, you should read the label—but you have to know what those ingredients are before you judge them healthy or not.

Throughout the narrative, Nguyen-Kim conveys an attractive brand of approachability that makes you want to sit down and have a beer with her, but unfortunately she can't drink: Mai, born of Vietnamese parents, has inherited a gene mutation common among people of East Asian descent that interferes with the way the body processes alcohol, so she becomes overly intoxicated after just a few sips of any strong drink. She explains in detail why her "broken" ALDH2 enzyme simply will not break down the acetaldehyde her body produces from the glass of wine that makes her guests a little tipsy but gives her nausea and a rapid heartbeat and sends a "weird, lobster-red tinge" to her face. Mai's issue with alcohol reminded me of recent studies revealing that the reason some people of northern European ancestry always burn instead of tan at the beach is faulty genes that block the creation of melanin in response to sun exposure. This underscores that while race is of course a myth that otherwise communicates nothing of importance about human beings, in the medical world genetics has the potential to serve as a powerful tool to explain and treat disease. As for Mai, given the overall health risks of alcohol consumption, she views her inability to drink as more of a blessing than a curse, and hopes to pass her broken gene on to her offspring!
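For readers curious about the underlying pathway, the standard two-step biochemistry (summarized here from textbook chemistry, not quoted from the book) runs:

\[
\text{ethanol} \xrightarrow{\ \text{ADH}\ } \text{acetaldehyde} \xrightarrow{\ \text{ALDH2}\ } \text{acetate}
\]

When the second enzyme is impaired, as with Mai's ALDH2 variant, the toxic intermediate acetaldehyde accumulates, producing the flushing, nausea, and racing heartbeat she describes.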

The odds that I would ever deliberately set out to read a book about chemistry were never that favorable. That I would do so and then rave about the experience seemed even more unlikely. But here we are, and my highest recommendation along with it. Mai's love of science is nothing less than contagious. If you read her work, I can promise that not only will you learn a lot, but you will really enjoy the learning process. And that too, I suppose, is chemistry!


[Note: I read an Advance Reader’s Copy of this book as part of an early reviewer’s program]

Review of: The Lost Founding Father: John Quincy Adams and the Transformation of American Politics, by William J. Cooper

Until Jimmy Carter came along, there really was no rival to John Quincy Adams (1767-1848) as best ex-president, although perhaps William Howard Taft earns honorable mention for his later service as Chief Justice of the Supreme Court. Carter—who at ninety-seven still walks among us as this review goes to press—has made his reputation as a humanitarian outside of government after what many view as a mostly failed single term in the White House. Adams, on the other hand, whose one term as the sixth President of the United States (1825-29) was likewise disappointing, managed to establish an outsize official legacy when he returned to serve his country as a member of the House of Representatives from 1831 until his dramatic collapse at his desk and subsequent death inside the Capitol Building in 1848. Freshman Congressman Abraham Lincoln would be a pallbearer.

Like several of the Founders whose own later presidential years were troubled, including his own father, John Quincy had a far more distinguished and successful career prior to his time as Chief Executive. But quite remarkably, unlike those other men—John Adams, Jefferson, Madison—who lingered in mostly quiet retirement for decades after their respective tenures, John Quincy Adams arguably equaled or surpassed his accomplished pre-presidential service as diplomat, United States Senator, and Secretary of State when he returned as a simple Congressman from Massachusetts who became a giant of antislavery advocacy. Adams remains the only former president elected to the House, and until George W. Bush in 2001, the only man who could claim his own father as a fellow president.

Notably, the single unsatisfactory terms that he and his father served in the White House turned out to be bookends to a significant era in American history: John Adams was the first to run for president in a contested election (Washington had essentially been unopposed); his son's tenure ended along with the Early Republic, shattered by the ascent of Jacksonian democracy. But if the Early Republic was no more, it marked only the beginning of another chapter in the extraordinary life of John Quincy Adams. And yet, for a figure who carved such indelible grooves in our nation's history, present at the creation and active well into the crises of the antebellum period that not long after his death would threaten to annihilate the American experiment, it is somewhat astonishing how utterly unfamiliar he remains to most citizens of the twenty-first century.

Prominent historian William J. Cooper seeks to remedy that with The Lost Founding Father: John Quincy Adams and the Transformation of American Politics (2017), an exhaustively researched, extremely well-written, if dense study that is likely to claim distinction as the definitive biography for some years to come. Cooper’s impressive work is old-fashioned narrative history at its best. John Quincy Adams is the main character, but his story is told amid the backdrop of the nation’s founding, its evolution as a young republic, and its descent to sectional crises over slavery, while many, at home and abroad, wondered at the likelihood of its survival. It is not only clever but entirely apt that in the book’s title the author dubs his subject the “Lost Founding Father.”

Some have called Benjamin Franklin the “grandfather of his country.” Likewise, John Quincy Adams could be said to be a sort of “grandson.” He was not only to witness the tumultuous era of the American Revolution and observe John Adams’ storied role as a principal Founder, he also accompanied his father on diplomatic missions to Europe while still a boy, and completed most of his early education there.  Like Franklin, Jefferson, and his father, he spent many years abroad during periods of fast-moving events and dramatic developments on American soil that altered the nation and could prove jarring upon return. Unlike the others, his extended absence coincided with his formative years; John Quincy grew up not in New England but rather in France, the Netherlands, Russia, and Great Britain, and this came to deeply affect him.

A brooding intellectual with a brilliant mind who sought solitude over society, dedicated to principle above all else, including loyalty to party, the Adams that emerges in these pages was a socially awkward workaholic subject to depression, blessed with talents that ranged from the literary to languages to the deeply analytical, but lacking even the tiniest vestige of charisma. He strikes the reader as the least suitable person ever to aspire to or serve as president of the United States. A gifted writer, he began a diary when he was twelve years old that he continued almost without interruption until shortly before his death. He frequently expressed dismay at his inability to keep up with his ambitious goals for daily diary entries that often ran to considerable length.

There is much in the man that resembles his father, also a principled intellect, whom he much admired even as he suffered a sense of inadequacy in his shadow. Both men were stubborn in their ideals and tended to alienate those who might otherwise be allies. While each could be self-righteous, John Adams was also ever firmly self-confident in a way that his son could never match. Of course, in his defense, the younger man not only felt obligated to live up to a figure who was a titan in the public arena, but he lacked a wife cut from the same cloth as his mother, with whom he had a sometimes-troubled relationship.

Modern historians have made much of the historic partnership that existed, mostly behind the scenes, between John and Abigail Adams; in every way except eighteenth-century mores she seems his equal. John Quincy, on the other hand, was wedded to Louisa Catherine, a sickly woman given to fainting spells and frequent migraines whose multiple miscarriages coupled with the loss of an infant daughter certainly triggered severe psychological trauma. A modern audience can't help but wonder if her many maladies and histrionics were not psychosomatic. At any rate, John Quincy treated his wife and other females he encountered with the patronizing male chauvinism typical of his times, so it is doubtful that, had he instead found an Abigail Adams at his side, he could have flourished in her orbit the way his father did.

Although Secretary of State John Quincy Adams was largely the force that drove the landmark "Monroe Doctrine" and other foreign policy achievements of the Monroe Administration, most who know of Adams tend to know him only peripherally, through his legendary political confrontation with the far more celebrated Andrew Jackson. That conflict was forged in the election of 1824. The Federalist Party, scorned for threats of New England secession during the War of 1812, was essentially out of business. James Monroe was wrapping up his second term in what historians have called the "Era of Good Feelings," which ostensibly reflected a sense of national unity under a single party, the Democratic-Republicans, but there were fissures, factions, local interests, and emerging coalitions beneath the surface. In the most contested election to date in the nation's history, John Quincy, Andrew Jackson, Henry Clay, and William Crawford were the chief contenders for the highest office. While Jackson received a plurality, none received a majority of the electoral votes, so as specified in the Constitution the race was sent to the House for decision. Crawford had suffered a devastating stroke and was thus out of consideration. Adams and Clay tended to clash, but both were aligned on many national issues, and Jackson was rightly seen as a dangerous demagogue. Clay threw his support to Adams, who became president. Jackson was furious, all the more so when Adams named Clay Secretary of State, then seen as a sure steppingstone to the presidency; Jackson branded the arrangement a "Corrupt Bargain." As it turned out, while Adams prevailed, his presidency was marked by frustration, his ambitious domestic goals stymied by Congress. In a run for reelection, he was dealt a humiliating defeat by Jackson, who headed the new Democratic Party. The politics of John Quincy Adams and the Early Republic went extinct.

While evaluating these two elections, it's worth pausing here to emphasize John Quincy's longtime objection to the nefarious if often overlooked impact of the three-fifths clause in the Constitution, which granted southern slaveholding states outsize political clout by counting an enslaved individual as three-fifths of a person for the purpose of representation. This was to prove significant, since the slave south claimed a disproportionate share of national political power when it came to advancing legislation or, for that matter, electing a president. He came to focus on this issue while Secretary of State, during the debate that swirled around the Missouri Compromise of 1820, concluding that:

The bargain in the Constitution between freedom and slavery had conveyed to the South far too much political influence, its base the notorious three-fifths clause, which immorally increased southern power in the nation … the past two decades had witnessed a southern domination that had ravaged the Union … he emphasized what he saw as the moral viciousness of that founding accord. It contradicted the fundamental justification of the American Revolution by subjecting slaves to oppression while privileging their masters with about a double representation.  [p174]

This was years before he was himself to fall victim to the infamous clause. As underscored by historian Alan Taylor in his recent work, American Republics (2021), the disputed election of 1824 would have been far less disputed without the three-fifths clause, since in that case Adams would have led Andrew Jackson in the Electoral College 83 to 77 votes, instead of putting Jackson in the lead 99 to 84. When Jackson prevailed in the next election in 1828, it was the south that cemented his victory. The days of Virginia planters in the White House may have passed, but the slave south clearly dominated national politics and often served as antebellum kingmaker for the White House.

In any case, Adams’ dreams of vindicating his father’s single term were dashed.  A lesser man would have gone off into the exile of retirement, but Adams was to come back—and come back stronger than ever as a political figure to be reckoned with, distinguished by his fierce antislavery activism. His abhorrence of human bondage ran deep, and long preceded his return to Congress. And because he kept such a detailed journal, we have insight into his most personal convictions.

Musing once more about the Missouri Compromise, he confided to his diary his belief that a war over slavery was surely on the horizon that would ultimately result in its elimination:  “If slavery be the destined sword in the hand of the destroying angel which is to sever the ties of this Union … the same sword will cut in sunder the bonds of slavery itself.” [p173] He also wrote of his conversations with the fellow cabinet secretary he most admired at the time, South Carolina’s John C. Calhoun, who clearly articulated the doctrine of white supremacy that defined the south. To Adams’ disappointment, Calhoun told him that southerners did not believe the Declaration’s guarantees of universal rights applied to blacks, and “Calhoun maintained that racial slavery guaranteed equality among whites because it placed all of them above blacks.” [p175]

These diary entries from 1820 came to foreshadow the more crisis-driven politics of the decades that followed, when Adams—his unhappy presidency long behind him—was the leading figure in Congress who stood against the south's "peculiar institution" and southern domination of national politics. These were, of course, far more fraught times. He opposed both Texas annexation and the Mexican War, which he correctly viewed as a conflict designed to extend slavery. But he most famously led the opposition against the 1836 resolution known as the "gag rule" that prohibited House debate on petitions to abolish slavery, which incensed the north and spawned greater polarization. Adams was eventually successful, and the gag rule was repealed, but not until 1844.

It has long been my goal to read at least one biography of each American president, and I came to Cooper's book with that objective in mind. I found my time with it a deeply satisfying experience, although I suspect that because it is so dense with detail it will find less appeal among a more popular audience. Still, if you want to learn about this too often overlooked critical figure and at the same time gain a greater understanding of an important era in American history, I would highly recommend that you turn to The Lost Founding Father.

——————————————

Note: I reviewed the referenced Alan Taylor work here: Review of: American Republics: A Continental History of the United States, 1783-1850, by Alan Taylor


Review of: Marching Masters: Slavery, Race, and the Confederate Army During the Civil War, by Colin Edward Woodward

Early in the war … a Union squad closed in on a single ragged Confederate, and he obviously didn't own any slaves. He couldn't have much interest in the Constitution or anything else. And they asked him: "What are you fighting for, anyhow?" And he said: "I'm fighting because you're down here." Which is a pretty satisfactory answer.

That excerpt is from Ken Burns' epic The Civil War (1990) docuseries, Episode 1, "The Cause." It was delivered by the avuncular Shelby Foote in his soft, reassuring—some might say mellifluous—cadence, the inflection decorated with a pronounced but gentle southern accent. As professor of history James M. Lundberg complains, Foote, author of a popular Civil War trilogy who was himself not a historian, "nearly negates Burns' careful 15-minute portrait of slavery's role in the coming of the war with a 15-second" anecdote. Elsewhere, Foote disputes the scholarly consensus that slavery was the central cause of secession and of the conflict it spawned, a conflict that would take well over 600,000 American lives.

While all but die-hard "Lost Cause" myth fanatics have relegated Foote's ill-conceived dismissal of the centrality of slavery to the dustbin of history, the notion that southern soldiers fought solely for home and hearth has long persisted, even among historians. And on the face of it, it seems as if it should be true. After all, secession was the work of a narrow slice of the antebellum south, the slave-owning planter class, which comprised less than two percent of the population but dominated the political elite, in fury that Lincoln's election by "Free-Soil" Republicans would likely deny their demands to transplant their "peculiar institution" to the new territories acquired in the Mexican War. More critically, three-quarters of southerners owned no slaves at all, and nearly ninety percent of the remainder owned twenty or fewer. Most whites lived at the margins as yeoman farmers, although their skin color ensured a status markedly above that of blacks, free or enslaved. The Confederate army closely reflected that society: most rebel soldiers were not slaveowners. So slavery could not have been important to them … or could it?

The first to challenge the assumption that Civil War soldiers, north or south, were political agnostics was James M. McPherson in What They Fought For 1861-1865 (1995). Based on extensive research on letters written home from the front, McPherson argued that most of those in uniform were far more ideological than previously acknowledged. In a magnificent contribution to the historiography, Colin Edward Woodward goes much further in Marching Masters: Slavery, Race, and the Confederate Army During the Civil War (2014), presenting compelling evidence that not only were most gray-clad combatants well-informed about the issues at stake, but a prime motivating force for a majority was to preserve the institution of human chattel bondage and the white supremacy that defined the Confederacy.

Like McPherson, Woodward delves into the wealth of still extant letters from those at the front to make his case in a deeply researched and well-written narrative that reveals the average rebel to have been surprisingly well-versed in the greater issues manifested in the debates that launched an independent Confederacy and justified the blood and treasure being spent to sustain it. And just as in secession, the central focus was upon preserving a society that had its foundation in chattel slavery and white supremacy. Some letters were penned by those who left enslaved human beings—many or just a few—back at home with their families when they marched off to fight, while most were written by poor dirt farmers who had no human property and no immediate prospect of obtaining any.

But what is truly astonishing, as Woodward exposes in the narrative, is not only how frequently slavery and the appropriate status for African Americans are referenced in such correspondence, but how remarkably similar the language is, whether the soldier is the son of a wealthy planter or a yeoman farmer barely scraping by. In nearly every case, the righteousness of their cause is defined again and again not by the euphemism of "states' rights" that became the rallying cry of the "Lost Cause" after the war, but by the sanctity of the institution of human bondage. More than once, letters resound with a disturbing yet familiar refrain: that the most fitting condition for blacks was as human property, an arrangement portrayed as mutually beneficial to the master as well as to the enslaved.

If the spectacle of men without slaves risking life and limb to sustain slavery, with both musket in hand and zealous declarations in letters home, provokes a kind of cognitive dissonance in modern readers, we need only be reminded of our own contemporaries in doublewides who might sound the most passionate defense of Wall Street banks. Have-nots in America often aspire to what is beyond their reach, for themselves or for their children. For poor southern whites of the time, in and out of the Confederate army, that aspiration turned out to be slave property.

One of the greatest sins of postwar reconciliation and the tenacity of the "Lost Cause" was the erasure of African Americans from history. In the myth-making that followed Appomattox, with human bondage extinct and its practice widely reviled, the Civil War was transformed into a sectional war of white brother against white brother, and blacks were relegated to roles as bit players. The centrality of slavery was excised from the record. In the literature, blacks were generally recalled as benign servants loyal to their masters, like the terrified Prissy in Gone with the Wind screeching "De Yankees is comin!" in distress rather than the celebration more likely characteristic of that moment in real time. That a half million of the enslaved fled to freedom in Union lines was lost to memory. Also forgotten was the fact that by the end of the war, black soldiers of the United States Colored Troops (USCT) made up fully ten percent of the Union Army—and these men played a significant role in the south's defeat. Never mentioned was that Confederate soldiers routinely executed black men in blue uniforms who were wounded or attempting to surrender, not only in well-known encounters like Fort Pillow and the Battle of the Crater, but frequently and anonymously. As Woodward reminds us, this brand of murder was often unofficial, rarely acknowledged, and almost never condemned. Only recently have these aspects of Civil War history received the attention that is their due.

And yet, more remarkably, Marching Masters reveals that perhaps the deepest and most enduring erasure of African Americans was of the huge cohort that accompanied the Confederate army on its various campaigns throughout the war. Thousands and thousands of them. "Lost Cause" zealots have imagined great corps of "Black Confederates" who served as fighters fending off Yankee marauders, but if that is fantasy—and it certainly is—the massive numbers of blacks who served as laborers alongside white infantry were not only real but represented a significant reason why smaller forces of Confederates held out as well as they did against their often numerically superior northern opponents. We have long known that a greater percentage of southerners were able to join the military than their northern counterparts because slave labor at home in agriculture and industry freed up men to wield saber and musket, but Woodward uncovers the long-overlooked legions of the enslaved who traveled with the rebels, performing the kind of labor that (mostly) fell on white enlisted men in northern armies.

A segment of these were also personal servants to the sons of planters, which sometimes provoked jealousy among the ranks. Certain letters home plead for just such a servile companion, sometimes arguing that the enslaved person would be less likely to flee to Union lines if he were a cook in an army camp instead! And there were indeed occasional tender if somewhat perversely paternalistic bonds between the homesick soldier and the enslaved, some of which found wistful expression in letters, some manifested in relationships with servants in the encampments. Many soldiers had deep attachments to the enslaved people who had nurtured them as children in the bosom of their families; some of that was sincerely reciprocated. Woodward makes it clear that while certain generalities can be drawn, every individual—soldier or chattel—was a human being capable of a wide range of actions and emotions, from the cruel to the heartwarming. For better or for worse, all were creatures of their times and their circumstances. But, at the end of the day, white soldiers had something like free will; enslaved African Americans were subject to the will of others, sometimes for the better but more often for the worse.

And then there was impressment. One of the major issues relatively unexplored in the literature is the resistance of white soldiers in the Confederate army to performing menial labor—the same tasks routinely done by white soldiers in the Union army, who grumbled as all those in the ranks in every army were wont to do while nevertheless following orders. But southern boys were different. Nurtured in a society firmly grounded in white supremacy, with chattel property doomed to the most onerous toil, rebels not only typically looked down upon hard work but—as comes out in their letters—equated it with "slavery." To cope with this and an overall shortage of manpower, legislation was passed in 1863 mandating impressment of the enslaved along with a commitment of compensation to owners. The measure was not well received, but it was enacted nonetheless, and thousands more blacks were sent to camps to do the work soldiers were not willing to do.

The numbers were staggering. When Lee invaded Pennsylvania, his army included 6,000 enslaved blacks—another ten percent added to the 60,000 infantry troops he led to Gettysburg! This of course does not include the runaways and free blacks his forces seized and enslaved after he crossed the state line. The point to all of this, of course, is that for the average rebel soldier in the ranks, slavery was not some ideological abstraction confined to the home front, whether his own family owned chattel property or not. Instead, the enslaved were with you in the field every day, not figuratively but in the flesh. With this in mind, sounding a denial that slavery served as a critical motivation for Confederate troops rings decidedly off-key.

While slavery was the central cause of the war, it was certainly not the only cause. There were other tensions that included agriculture vs. industry, rural vs. urban, states’ rights vs. central government, tariffs, etc. But as historians have long concluded, none of these factors on their own could ever have led to Civil War. Likewise, southern soldiers fought for a variety of reasons. While plenty were volunteers, many were also drafted into the war effort. Like soldiers from ancient times to the present day, they fought because they were ordered to, because of their personal honor, because they did not want to appear cowardly in the eyes of their companions. And because much of the war was decided on southern soil, they also fought for their homeland, to defend their families, to preserve their independence. So Shelby Foote might have had a point. But what was that independence based upon? It was fully and openly based upon creating and sustaining a proud slave republic, as all the rhetoric in the lead-up to secession loudly underscored.

Marching Masters argues convincingly that the long-held belief that southern soldiers were indifferent to or unacquainted with the principles that guided the Confederate States of America is in itself a kind of myth that encourages us to not only forgive those who fought for a reprehensible cause but to put them on a kind of heroic pedestal. Many fought valiantly, many lost their lives, and many were indeed heroes, but we must not overlook the cause that defined that sacrifice.  In this, we must recall the speech delivered by the formerly enslaved Frederick Douglass on “Remembering the Civil War” with his plea against moral equivalency that is as relevant today as it was when he delivered it on Decoration Day in 1878: “There was a right side and a wrong side in the late war, which no sentiment ought to cause us to forget, and while today we should have malice toward none, and charity toward all, it is no part of our duty to confound right with wrong, or loyalty with treason.”

For all of the more than 60,000 books on the Civil War, there still remains a great deal to explore and much that has long been cloaked in myth for us to unravel. It is the duty not only of historians but for all citizens of our nation—a nation that was truly reborn in that tragic, bloody conflict—to set aside popular if erroneous notions of what led to that war, as well as what motivated its long-dead combatants to take up arms against one another. To that end, Woodward’s Marching Masters is a book that is not only highly recommended but is most certainly required reading.


——————————-

Transcript of The Civil War (1990) docuseries, Episode 1, “The Cause:” https://subslikescript.com/series/The_Civil_War-98769/season-1/episode-1-The_Cause

Comments by James M. Lundberg: https://www.theatlantic.com/national/archive/2011/06/civil-war-sentimentalism/240082/

Speech by Frederick Douglass, "Remembering the Civil War," delivered on Decoration Day 1878: https://www.americanyawp.com/reader/reconstruction/frederick-douglass-on-remembering-the-civil-war-1877/

Review of: Sarah’s Long Walk: The Free Blacks of Boston and How Their Struggle for Equality Changed America, by Stephen Kendrick & Paul Kendrick

Several years ago, I published an article in a scholarly journal entitled "Strange Bedfellows: Nativism, Know-Nothings, African-Americans & School Desegregation in Antebellum Massachusetts" that spotlighted the odd confluence of anti-Irish nativism and the struggle to desegregate Boston schools. The Know-Nothings—a populist, nativist coalition that contained elements that would later be folded into the emerging Republican Party—made a surprising sweep in the 1854 Massachusetts elections, fueled primarily by anti-Irish sentiment, as well as a pent-up popular rage against the elite status quo that had long dominated state politics. Suddenly, the governor, all forty senators, and all but three members of the house were Know-Nothings!

Perhaps more startling was that during their brief tenure, the Know-Nothing legislature enacted a host of progressive reforms, creating laws to protect workingmen, ending imprisonment for debt, strengthening women's rights in property and marriage, and—most significantly—passing landmark legislation in 1855 that "prohibited the exclusion [from public schools] of children for either racial or religious reasons," which effectively made Massachusetts the first state in the country to ban segregation in schools! The debate prior to passage of the desegregation bill featured a quote from the record that is, to today's ears, perhaps at once comic and cringeworthy, as one proponent of the new law sincerely voiced his regret "that Negroes living on the outskirts . . . were forced to go a long distance to [the segregated] Smith School. . . while . . . the 'dirtiest Irish,' were allowed to step from their houses into the nearest school."

My article focused on Massachusetts politics and the bizarre incongruity of nativists unexpectedly delivering the long sought-after prize of desegregated schools to the African American community. It is also the story of the nearly forgotten black abolitionist and integrationist William Cooper Nell, a mild if charismatic figure who united disparate forces of blacks and whites in a long, stubborn, determined campaign to end Boston school segregation. But there are many other important stories of the people and events leading to that moment which, due to space constraints, could not receive adequate treatment in my article.

Arguably the most significant one, which my article references but does not dwell upon, centers upon a little black girl named Sarah Roberts. Her father, Benjamin R. Roberts, sued for equal protection rights under the state constitution because his daughter was barred from attending a school near her residence and was compelled to a long walk to the rundown and crowded Smith School instead. He was represented by Robert Morris, one of the first African American attorneys in the United States, and Charles Sumner, who would later serve as United States Senator. In April 1850, in Roberts v. The City of Boston, the Massachusetts Supreme Judicial Court ruled against him, declaring that each locality could decide for itself whether to have or end segregation. This ruling was to serve as an unfortunate precedent for the ignominious "separate but equal" doctrine of Plessy v. Ferguson some decades hence. It was also an obstacle Thurgood Marshall had to surmount when he successfully argued to have the Supreme Court strike down school segregation across the nation in 1954's breakthrough Brown v. Board of Education—just a little more than a century after the disappointing ruling in the Roberts case.

Father and son Stephen Kendrick and Paul Kendrick teamed up to tell the Roberts story and a good deal more in Sarah's Long Walk: The Free Blacks of Boston and How Their Struggle for Equality Changed America, an extremely well-written, comprehensive, if occasionally slow-moving chronicle that recovers for the reader the vibrant, long overlooked black community that once peopled Boston in the years before the Civil War. In the process, the authors reveal how it was that while the state of Massachusetts offered the best overall quality of life in the nation for free blacks, it was also home to the same stark, virulent racism characteristic of much of the north in the antebellum era, a deep-seated prejudice that manifested itself not only in segregated schools but also in a strict separation in other arenas such as transportation and theaters.

Doctrines of abolition were widely despised, north and south, and while abolitionists remained a minority in Massachusetts as well, it was perhaps the only state in the country where antislavery ideology achieved widespread legitimacy. But true history is all nuance, and those who might rail passionately against the inherent evil in holding humans as chattel property did not necessarily also advance notions of racial equality. That was indeed far less common. Moreover, it is too rarely underscored that the majority of northern "Freesoilers" who were later to become the most critical component of the Republican Party vehemently opposed the spread of slavery to the new territories acquired in the Mexican War while concomitantly despising blacks, free or enslaved.

At the same time, there was hardly unanimity in the free black community when it came to integration; some blacks welcomed separation. Still, as Sarah's Long Walk relates, there were a number of significant African American leaders like Robert Morris and William Cooper Nell who, with their white abolitionist allies, played the long game and pursued compelling, nonviolent mechanisms to achieve both integration and equality, many of which presaged the tactics of Martin Luther King and other Civil Rights figures a full century later. For instance, rather than lose hope after the Roberts court decision, Nell doubled down on his efforts, this time with a new strategy—a taxpayer's boycott that saw prominent blacks move out of the city to suburbs that featured integrated schools, depriving Boston of tax revenue.

The Kendrick’s open the narrative with a discussion of Thurgood Marshall’s efforts to overturn the Roberts precedent in Brown v. Board of Education, and then trace that back to the flesh and blood Boston inhabitants who made Roberts v. The City of Boston possible, revealing the free blacks who have too long been lost to history. Readers not familiar with this material will come across much that will surprise them between the covers of this fine book. The most glaring might be how thoroughly in the decades after Reconstruction blacks have been erased from our history, north and south. Until recently, how many growing up in Massachusetts knew anything at all about the thriving free black community in Boston, or similar ones elsewhere above the Mason-Dixon?

But most astonishing for many will be the fact that the separation of races that would become the new normal in the post-Civil War "Jim Crow" south had its roots fully nurtured in the north decades before Appomattox. Whites and their enslaved chattels shared lives intertwined in the antebellum south, while separation between whites and blacks was fiercely enforced in the north. Many African Americans in Massachusetts had fled bondage, or had family members who were runaways, and knew full well that southern slaveowners commonly traveled by rail accompanied by their enslaved servants, while free blacks in Boston were relegated to a separate car until the state prohibited racial segregation in mass transportation in 1842.

Sarah may not have been spared her long walk to school, but the efforts of integrationists eventually paid off when school segregation was prohibited by Massachusetts law just five years after Sarah’s father lost his case in court. Unfortunately, this battle had to be waged all over again in the 1970s, this time accompanied by episodes of violence, as Boston struggled to achieve educational equality through controversial busing mandates that in the long term generated far more ill will than sustainable results. Despite the elevation of Thurgood Marshall to the Supreme Court bench, and the election of the first African American president, more than one hundred fifty years after the Fourteenth Amendment became the law of the land, the Black Lives Matter (BLM) movement reminds us that there is still much work to be done to achieve anything like real equality in the United States.

For historians and educators, an even greater concern these days lies in the concerted efforts by some on the political right to erase the true story of African American history from public schools. As this review goes to press in Black History Month, February 2022, shameful acts are becoming law across a number of states: gaslighting legislation ostensibly designed to ban Critical Race Theory (CRT) that effectively prohibits educators from teaching their students the true history of slavery, Reconstruction, and Civil Rights. As of this morning, there are some one hundred thirteen other bills being advanced across the nation that could serve as potential gag orders in schools. How can we best combat that? One way is to loudly protest to state and federal officials, to insist that black history is also American history and should not be erased. The other is to freely share black history in your own networks. The best weapons for that in our collective arsenal are quality books like Sarah's Long Walk.


My journal article, “Strange Bedfellows: Nativism, Know-Nothings, African-Americans & School Desegregation in Antebellum Massachusetts,” and related materials can be accessed by clicking here: Know-Nothings

For more about the Know-Nothings, I recommend this book which I reviewed here: Review of: The Know-Nothing Party in Massachusetts: The Rise and Fall of a People’s Movement, by John R. Mulkern


Review of: Into the Heart of the World: A Journey to the Center of the Earth, by David Whitehouse

A familiar trope in the Looney Tunes cartoons of my boyhood had Elmer Fudd or some other zany character digging a hole with such vigor and determination that he emerged on the other side of the world in China, greeted by one or more of the stereotypically racist Asian animated figures of the day. In the 1964 Road Runner vehicle "War and Pieces," Wile E. Coyote goes one better, riding a rocket clear through the earth—presumably passing through its center—until he appears on the other side dangling upside down, only to then encounter a Chinese Road Runner embellished with a braided pigtail and conical hat who bangs a gong with such force that he is driven back through the tunnel to end up right where he started from. In an added flourish, the Chinese Road Runner then peeps his head out of the hole and beep-beeps faux Chinese characters that turn into letters that spell "The End."

There were healthy doses of both hilarious comedy and uncomfortable caricature here, but what really stuck in a kid's mind was the notion that you could somehow burrow through the earth with a shovel or some explosive force, which it turns out is just as impossible in 2022 as it was in 1964. But if you hypothetically wanted to give it a go, you would have to start at China's actual antipode in this hemisphere, which lies in Chile or Argentina, and then tunnel some 7,918 miles: twice the distance to the center of the earth you would pass through, which lies around 3,959 miles (6,371 km) below the surface.

So what about the center of the earth? Could we go there? After all, we did visit the moon, which at an average distance of 238,855 miles is far more remote. But of course what lies between the earth and its single satellite is mostly empty space, not the crust, mantle, outer core, and inner core of a rocky earth that is a blend of the solid and the molten. Okay, it's a challenge, you grant, but how far have we actually made it in our effort to explore our inner planet? We must have made some headway, right? Well, it turns out that the answer is: not very much. A long, concerted drilling effort begun in 1970 by the then Soviet Union reached a measly milestone of a mere 7.6 miles (12.3 km) at the Kola Superdeep Borehole near the Russian border with Norway; efforts were abandoned in 1994 because of higher-than-expected temperatures of 356 °F (180 °C). Will new technologies take us deeper one day at this site or another? Undoubtedly. But it likely will not be in the near future. After all, there's another 3,951.4 miles to go, and conditions will only grow more perilous at greater depths.
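To put that milestone in perspective, here is the back-of-the-envelope arithmetic using the figures above (mean values; the earth is not a perfect sphere, so exact depths vary by location):

\[
3{,}959\ \text{mi} - 7.6\ \text{mi} = 3{,}951.4\ \text{mi remaining}, \qquad \frac{7.6\ \text{mi}}{3{,}959\ \text{mi}} \approx 0.2\%
\]

In other words, humanity's deepest borehole has covered roughly two-tenths of one percent of the way to the planet's center.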

But we can dream, can’t we? Indeed. And it was Jules Verne who did so most famously when he imagined just such a trip in his classic 1864 science fiction novel, Journey to the Center of the Earth.  Astrophysicist and journalist David Whitehouse cleverly models his grand exploration of earth’s interior, Into the Heart of the World: A Journey to the Center of the Earth, on Verne’s tale, a well-written, highly accessible, and occasionally exciting work of popular science that relies on geology rather than fiction to transport the reader beneath the earth’s crust through the layers below and eventually to what we can theoretically conceive based upon the latest research as the inner core that comprises the planet’s center.

It is surprising just how few people today possess a basic understanding of the mechanics that power the forces of the earth.  But perhaps even more astonishing is how new—relatively—this science is. When I was a child watching Looney Tunes on our black-and-white television, my school textbooks admitted that although certain hypotheses had been suggested, the causes of sometimes catastrophic events such as earthquakes and volcanoes remained essentially unknown. All that changed effectively overnight—around the time my family got our first color TV—with the widespread acceptance by geologists of the theory of plate tectonics, constructed on the foundation of the much earlier hypothesis of German meteorologist and geophysicist Alfred Wegener, who in 1912 advanced the view of continents in motion known as “continental drift,” which was ridiculed in his time. By 1966, the long-dead Wegener was vindicated, and continental drift was upgraded to the more elegant model of plate tectonics that fully explained not only earthquakes and volcanoes, but mountain-building, seafloor spreading, and the whole host of other processes that power a dynamic earth.

Unlike some disciplines such as astrophysics, the basic concepts that make up earth science are hardly insurmountable for any individual of average intelligence, so for those who have no idea how plate tectonics works and are curious enough to want to learn, Into the Heart of the World is a wonderful starting point. Whitehouse can be credited with articulating complicated processes in an easy-to-follow narrative that consistently holds the reader's interest and remains fully comprehensible to the non-scientist. I came to this book with more than a passing familiarity with plate tectonics, but I nevertheless added to my knowledge base and enjoyed the way the author united disparate topics into this single theme of a journey to the earth's inner core.

If I have a complaint, it is only a quibble tied to my own preferences: Into the Heart of the World often devotes far more paragraphs to the history of "how we know what we know" than to a more detailed explanation of the science itself. The author is not to be faulted for what is integral to the structure of the work—after all, the cover does boast "A Remarkable Voyage of Scientific Discovery"—but it left me longing for more. Also, some readers may stumble over these backstories of people and events, eager instead to get to the fascinating essence of what drives the forces that shape our planet.

A running gag in more than one Bugs Bunny episode has the wacky rabbit inadvertently tunneling to the other side of the world, then admonishing himself that "I knew I shoulda taken that left turn at Albuquerque!" He doesn't comment on what turn he took at his juncture with the center of the earth, but many kids who sat cross-legged in front of their TVs wondered what that trip might look like. For grownups who still wonder, I recommend Into the Heart of the World as your first stop.


[Note: this book has also been published under the alternate title, Journey to the Centre of the Earth: The Remarkable Voyage of Scientific Discovery into the Heart of Our World.]

[A link to the referenced 1964 Road Runner episode is here: War and Pieces]
