“Stranger, whoever you are, open this to learn what will amaze you.”
This passage, in ancient Greek and in translation, is the key to Cloud Cuckoo Land, a big, ambitious, complicated novel by Anthony Doerr, the latest from the author of the magnificent, Pulitzer Prize-winning All the Light We Cannot See (2014). Classicists will recognize “Cloud Cuckoo Land” as borrowed from The Birds, the 414 BCE comedy by the Athenian satirist Aristophanes, in which birds construct a city in the sky; the name later became synonymous with any kind of fanciful world. In this case, Cloud Cuckoo Land serves as the purported title of a long-lost ancient work by Antonius Diogenes, rediscovered as a damaged but partially translatable codex in 2019, that relates the tale of Aethon, a hapless shepherd who transforms into a donkey, then into a fish, then into a crow, in a quest to reach that utopian city in the clouds. It serves as well as the literary glue that binds together the narrative and the central protagonists of Doerr’s novel.
There is the octogenarian Zeno, self-taught in classical Greek, who has translated the fragmentary codex and adapted it into a play that is to be performed by fifth graders in the public library in Lakeport, Idaho, in 2020. Lurking in the vicinity is Seymour, an alienated teen with Asperger’s, flirting with eco-terrorism. And hundreds of years in the past, there is also the thirteen-year-old Anna, who has happened upon that same codex in Constantinople, on the eve of its fall to the Turks. Among the thousands of besiegers outside the city’s walls is Omeir, a harelipped youngster who with his team of oxen was conscripted to serve the Sultan in the cause of toppling the Byzantine capital. Finally, there is Konstance, fourteen years old, who has lived her entire life on the Argos, a twenty-second century spacecraft destined for a distant planet; she too comes to discover “Cloud Cuckoo Land.”
Alternating chapters, some short, others far longer, tell the stories of each protagonist, in real time or through flashbacks. For the long-lived Zeno, readers follow his hardscrabble youth, his struggle with his closeted homosexuality, his stint as a POW in the Korean War, and his long love affair with the language of the ancient Greeks. We observe how an uncertain and frequently bullied Seymour reacts to the destruction of wilderness and wildlife in his own geography. We watch the rebellious Anna abjure her work as a lowly seamstress to clandestinely translate the codex. We learn how the disfigured-at-birth Omeir is at first nearly left to die, then exiled along with his family because villagers believe he is a demon. We see Konstance, trapped in quarantine in what appears to be deep space, explore the old earth through an “atlas” in the ship’s library.
Cloud Cuckoo Land is by turns fascinating and captivating, but sometimes—unfortunately—also dull. There are not only the central protagonists to contend with, but also a number of secondary characters in each of their respective orbits, as well as multiple timelines spanning centuries, so there is much to keep track of. I recall being so spellbound by All the Light We Cannot See that I read its entire 500-plus pages over a single weekend. This novel, much longer, did not hook me with similar force. I found it a slow build: my enthusiasm tended to simmer rather than surge. Alas, I wanted to care about the characters far more than I did. Still, the second half of the novel is a much more exciting read than the first.
Science—in multiple disciplines—is often central to a Doerr novel. That was certainly the case in All the Light We Cannot See, as well as in his earlier work, About Grace. In Cloud Cuckoo Land, in contrast, science—while hardly absent—takes a backseat. The sci-fi in the Argos voyage is pretty cool, but hardly the stuff of Asimov or Heinlein. And Seymour’s science of climate catastrophe comes across as little more than an afterthought in the narrative.
Multiple individuals with lives on separate trajectories centuries apart, whose exploits resonate with larger and often overlapping themes, reminded me at first of another work with a cloud in its title: Cloud Atlas, by David Mitchell. But Cloud Cuckoo Land lacks the spectacular brilliance of that novel, which manages to take your very breath away. It also falls short of the depth and intricacy that powers Doerr’s All the Light We Cannot See. And yet … and yet … I ended up really enjoying the book, even shedding a tear or two in its final pages. So there’s that. In the final analysis, Doerr is a talented writer, and if this is not his finest work, it remains well worth the read.
I have reviewed other novels by Anthony Doerr here:
Reading “The Epic of Gilgamesh” in its entirety rekindled a long dormant interest in the Sumerians, the ancient Mesopotamian people that my school textbooks once boldly proclaimed as inventors not only of the written word, but of civilization itself! One of the pleasures of having a fine home library stocked with eclectic works is that there is frequently a volume near at hand to suit such inclinations, and in this case I turned to a relatively recent acquisition, The Sumerians, a fascinating and extremely well-written—if decidedly controversial—contribution to the Lost Civilizations series, by Paul Collins.
“The Epic of Gilgamesh” is, of course, the world’s oldest literary work: the earliest records of the five poems that form the heart of the epic were carved into Sumerian clay tablets dating back to 2100 BCE, and they relate the exploits of the eponymous Gilgamesh, an actual historic king of the Mesopotamian city-state Uruk circa 2750 BCE who later became the stuff of heroic legend. Most famously, a portion of the epic recounts a flood narrative nearly identical to the one reported in Genesis, making it the earliest reference to the Near East flood myth held in common by the later Abrahamic religions.
Uruk was just one of a number of remarkable city-states—along with Eridu, Ur, and Kish—that formed urban and agricultural hubs between the Tigris and Euphrates rivers in what is today southern Iraq, between approximately 3500 and 2000 BCE, at a time when the Persian Gulf extended much further north, putting these cities very near the coast. Some archaeologists also placed “Ur of the Chaldees,” the city in the Hebrew Bible noted as the birthplace of the Israelite patriarch Abraham, in this vicinity, reinforcing the Biblical flood connection. A common culture that boasted the earliest system of writing, which recorded in cuneiform script a language isolate unrelated to any other, advances in mathematics that utilized a sexagesimal system, and the invention of both the wheel and the plow came to be attributed to these mysterious non-Semitic people, dubbed the Sumerians.
But who were the Sumerians? They were completely unknown, notes the author, until archaeologists stumbled upon the ruins of their forgotten cities about 150 years ago. Collins, currently Curator for the Ancient Near East at the Ashmolean Museum*, University of Oxford, fittingly opens his work with the baked clay artifact known as a “prism,” inscribed with the so-called Sumerian King List, circa 1800 BCE, and currently housed in the Ashmolean. The opening passage of the book consists of the first lines of the Sumerian King List: “After the kingship descended from heaven, the kingship was in Eridu. In Eridu, Alulim became king; He ruled for 28,800 years.” Heady stuff.
“It is not history as we would understand it,” argues Collins, “but a combination of myth, legend and historical information.” This serves as a perfect metaphor for Collins’s thesis, which is that after a century and a half of archaeology and scholarship, we know less about the Sumerians—if such a structured, well-defined common culture ever even existed—and far more about the sometimes-spurious conclusions and even outright fictions that successive generations of academics and observers have attached to these ancient peoples.
Thus, Collins raises two separate if perhaps related issues that both independently and in tandem spark controversy. The first is the question of whether the Sumerians ever existed as a distinct culture, or whether—as the author suggests—scholars may have mistakenly woven a misleading tapestry out of scraps and threads in the archaeological record, representing a variety of inhabitants within a shared geography whose material cultures, while overlapping, were never of a single fabric. The second is how deeply woven into that same tapestry are distortions—some intended and others inadvertent—tailored to interpretations fraught with the biases of excavators and researchers determined to cast the Sumerians as uber-ancestors central to the myth of Western Civilization that tends to dominate the historiography. And, of course, if there is merit to the former, was it entirely the product of the latter, or were other factors involved?
I personally lack both the expertise and the qualifications to weigh in on the first matter, especially given that the author’s credentials include not only an association with Oxford’s School of Archaeology but also the chairmanship of the British Institute for the Study of Iraq. Still, I will note in this regard that he makes many thought-provoking and salient points. As to the second, Collins is quite persuasive, and here no great expertise on the reader’s part is required.
Nineteenth century explorers and archaeologists—as well as their early twentieth century successors—were often drawn to this Middle Eastern milieu in a quest for concordance between Biblical references and excavations, which bred distortions in outcomes and interpretation. At the same time, a conviction that race and civilization were inextricably linked—to be clear, the “white race” and “Western Civilization”—determined that what was perceived as “advanced” was ordained at the outset for association with “the West.” We know that the leading thinkers of the Renaissance rediscovered the Greeks and Romans as their cultural and intellectual forebears, with at least some measure of justification, but later far more tenuous links were drawn to ancient Egypt—and, of course, later still, to Babylon and Sumer. Misrepresentations, deliberate or not, were exacerbated by the fact that the standards of professionalism characteristic of today’s archaeology were either primitive or nonexistent.
None of this should be news to students of history who have observed how the latest historiography has frequently discredited interpretations long taken for granted—something I have witnessed firsthand as a dramatic work in progress in studies of the American Civil War in recent decades: notably, although slavery was central to the cause of secession and war, for more than a century African Americans were essentially erased from the textbooks and barely acknowledged other than at the very periphery of the conflict, in what was euphemistically constructed as a sectional struggle among white men, north and south. It was a lie, but a lie that sold very well for a very long time, and still clings to those invested in what has come to be called “Lost Cause” mythology.
And yet it’s surprising, as Collins underscores, that what should long have been second-guessed about Sumer remains integral to far too much of current thinking. Whether or not the Sumerians were indeed a distinct culture, should peoples more than five millennia removed from us continue to be artificially attached to what we pronounce Western Civilization? Probably not. And while we certainly recognize today that race is an artificial construct that conveys nothing of importance about a people, ancient or modern, we can guess with some confidence that those indigenous to southern Iraq in 3500 BCE did not have the pale skin of a native of, say, Norway. We can rightfully assert that the people we call the Sumerians were responsible for extraordinary achievements that were later passed down to the cultures that followed, but any attempt to draw some kind of line from Sumer to Enlightenment-age Europe is shaky, at best.
As such, Collins’s book focuses on what we have come to believe about the Sumerians, and why we should challenge it. I previously read (and reviewed) Egypt by Christina Riggs, another book in the Lost Civilizations series, which is preoccupied with how ancient Egypt has resonated for those who walked in its shadows, from Roman tourists to Napoleon’s troops to modern admirers, even if that vision little resembles its historic basis. Collins takes a similar tack but devotes far more attention to parsing out in greater detail exactly what is really known about the Sumerians and what we merely assume that we know. Of course, Sumer is far less familiar to a wider audience, and it lacks the romantic appeal of Egypt—there is no imagined exotic beauty like Cleopatra, only the blur of the distant god-king Gilgamesh—so the Sumerians come up far more rarely in conversation, and provoke far less passionate feelings, one way or the other.
The Sumerians is an accessible read for the non-specialist, and there are plenty of illustrations to enhance the text. Like other authors in the Lost Civilizations series, Collins deserves much credit for articulating sometimes arcane material in a manner that suits both a scholarly and a popular audience, which is by no means an easy achievement. If you are looking for an outstanding introduction to these ancient people that is neither too esoteric nor dumbed-down, I highly recommend this volume.
*NOTE: I recently learned that Paul Collins has apparently left the Ashmolean Museum as of end October 2022, and is now associated with the Middle East Department, British Museum.
Imagine if God—or Gary Larson—had an enormous mayonnaise jar at his disposal and stuffed it chock full of the collective consciousnesses of the greatest modern philosophers, psychoanalysts, neuroscientists, mathematicians, physicists, quantum theoreticians, and cosmologists … then lightly dusted it with a smattering of existential theologians, eschatologists, dream researchers, and violin makers, before tossing in a handful of race car drivers, criminals, salvage divers, and performers from an old-time circus sideshow … and next layered it with literary geniuses, heavy on William Faulkner and Ernest Hemingway with perhaps a dash of Haruki Murakami and just a smidge of Dashiell Hammett … before finally tossing in Socrates, or at least Plato’s version of Socrates, who takes Plato along with him because—love him or hate him—you just can’t peel Plato away from Socrates. Now imagine that giant jar being given a shake or two before being randomly dumped into the multiverse, so that all the blended yet still unique components poured out into our universe as well as into other hypothetical universes. If such a thing were possible, the contents that spilled forth might approximate The Passenger and Stella Maris, the pair of novels by Cormac McCarthy that has so stunned readers and critics alike that there is yet no consensus on whether to pronounce these works garbage or magnificent—or, for that matter, magnificent garbage.
The eighty-nine-year-old McCarthy, perhaps America’s greatest living novelist, released these companion books in 2022 after a sixteen-year hiatus that followed publication of The Road, the 2006 postapocalyptic sensation that explored familiar Cormac McCarthy themes in a very different genre, employing literary techniques strikingly different from his previous works, and in the process finding a whole new audience. The same might be said, to some degree, of the novel that preceded it just a year earlier, No Country for Old Men, another clear break from his past and a radical departure for readers of, say, The Border Trilogy, and his magnum opus, Blood Meridian, which to my mind is not only a superlative work but truly one of the finest novels of the twentieth century.
Full disclosure: I have read all of Cormac McCarthy’s novels, as well as a play and a screenplay that he authored. To suggest that I am a fan would be a vast understatement. My very first McCarthy novel was The Crossing, randomly plucked from a grocery store magazine rack while on a family vacation. That was 2008. I inhaled the book and soon set out to read his full body of work. The Crossing is actually the middle volume in The Border Trilogy, preceded by All the Pretty Horses and followed by Cities of the Plain; collectively they form a near-Shakespearean epic of the American southwest and the Mexican borderlands in the mid-twentieth century, regions that yet retain a stark primitivism barely removed from the milieu of Blood Meridian, set a full century earlier. The author’s style, in these sagas and beyond, has at times been compared by critics with both Faulkner and Hemingway, both favorably and unfavorably, but McCarthy’s voice is distinctive, and hardly derivative. There is indeed the rich vocabulary of a Faulkner or a Styron, which adds depth to the prose even as it challenges readers to sometimes seek out the dictionary app on their phones. There is also a magnificent use of the objective correlative, made famous by Hemingway and later in portions of the works of Gabriel García Márquez, which evokes powerful emotions from inanimate objects. For McCarthy, this often manifests in the vast, seemingly otherworldly geography of the southwest. McCarthy also frequently makes use of Hemingway’s polysyndetic syntax, which adds emphasis to sentences through a series of conjunctions. Most noticeable for those new to Cormac McCarthy is his omission of most traditional punctuation, such as quotation marks, which often improves the flow of the narrative even as it sometimes lends a certain confusion to long dialogues between two characters that span several pages.
The Passenger opens with the prologue of a Christmas day suicide that must be recited in the author’s voice as an underscore to the beauty of his prose:
It had snowed lightly in the night and her frozen hair was gold and crystalline and her eyes were frozen cold and hard as stones. One of her yellow boots had fallen off and stood in the snow beneath her. The shape of her coat lay dusted in the snow where she’d dropped it and she wore only a white dress and she hung among the bare gray poles of the winter trees with her head bowed and her hands turned slightly outward like those of certain ecumenical statues whose attitude asks that their history be considered. That the deep foundation of the world be considered where it has its being in the sorrow of her creatures. The hunter knelt and stogged his rifle upright in the snow beside him … He looked up into those cold enameled eyes glinting blue in the weak winter light. She had tied her dress with a red sash so that she’d be found. Some bit of color in the scrupulous desolation. On this Christmas day.
With a poignancy reminiscent of the funeral of Peyton Loftis, also a suicide, in the opening of William Styron’s Lie Down in Darkness, the reader here encounters the woman we later learn is Alicia Western, one of the two central protagonists of The Passenger and its companion volume, who, much like Peyton in Styron’s novel, haunts the narrative with chilling flashbacks. Ten years have passed when, on the very next page, we meet her brother Bobby, a salvage diver exploring a submerged plane wreck who happens upon clues that could put his life in jeopardy among those seeking something missing from that plane. Bobby is a brilliant intellect who could have been a physicist, but instead spends his life chasing down whatever provokes his greatest psychological fears. In this case, the terror of being deep underwater has driven him to salvage work in the oceans. Bobby is also a rugged and resourceful man’s man, a kind of Llewelyn Moss from No Country for Old Men, but with a much higher I.Q. Finally, Bobby, now thirty-seven years old, has never recovered from the death of his younger sister, with whom he had a close, passionate—and possibly incestuous—relationship.
Also integral to the plot is their now deceased father, a physicist who was once a key player in the Manhattan Project that produced the first atomic bombs, which obliterated Hiroshima and Nagasaki. Their first names—Alicia and Bobby—seem an ironic echo of the “Alice and Bob” characters used as placeholder names in thought experiments in physics and cryptography. Their surname, Western, could be a kind of doomed metaphor for the tragedy of mass murder on a scale never before imagined that has betrayed the promise of western civilization in the twentieth century and in its aftermath.
A real sense of doom, and a mounting paranoia, grips the narrative in general and Bobby in particular, in what appears to be a kind of mystery/thriller that meanders about, sometimes uncertainly. The cast of characters is extremely colorful, from a Vietnam veteran whose only regret among the many lives he brutally took while in-country is for the elephants he exploded with rockets from his gunship just for fun, to a small-time swindler with a wallet full of credit cards that don’t belong to him, and a bombshell trans woman with a heart of gold. Some of these folks are like the sorts that turn up in John Steinbeck’s Tortilla Flat, but on steroids, and more likely to suffer an unpredictable death.
But it is Alicia who steals the show in flashback fragments that reveal a stunningly beautiful young woman whose own brilliance in mathematics, physics, and music overshadows even Bobby’s. She seems to be schizophrenic, plagued by extremely well-defined hallucinations of bedside visitors who could be incarnations of walk-ons from an old-time circus sideshow, right out of central casting. The most prominent is the “Thalidomide Kid”—replete with the flippers most commonly identified with those deformities—who engages her as interlocutor in long-winded, fascinating, and often disturbing dialogue that can run to several pages. Alicia has been on meds, and has checked herself into institutions, but in the end she becomes convinced both that her visitors are real and that she herself does not belong in this world. But is Alicia even human? There are passing hints that she could be an alien, or perhaps from another universe.
There’s much more, including an episode where “The Kid,” Alicia’s hallucination (?), takes a long walk on the beach with Bobby. This is surprising, if only because McCarthy has long pilloried the magical realism that frequently populates the novels of García Márquez or Haruki Murakami. Perhaps “The Kid” is no hallucination, after all? In any event, much like a Murakami novel—think 1Q84, for example—there are multiple plot lines in The Passenger that go nowhere, and the reader is left frustrated by the lack of resolution. And yet … and yet, the characters are so memorable, and the quality of the writing is so exceptional, that when the cover is finally closed, it closes without an ounce of regret for the experience. And at the same time, the reader demands more.
The “more” turns out to be Stella Maris, the companion volume that is absolutely essential to broadening your awareness of the plot of The Passenger. Stella Maris is a mental institution that Alicia—then a twenty-year-old dropout from a doctoral program in mathematics—has checked herself into one final time, in the very last year of her life, a full decade before the events recounted in The Passenger. She has no luggage, but carries forty thousand dollars in a plastic bag, which she attempts to give to a receptionist. Bobby, in those days a race car driver, lies in a coma as the result of a crash. He is not expected to recover, but Alicia refuses to remove him from life support. The novel is told solely in transcript form, through the psychiatric sessions of Alicia and a certain Dr. Cohen, but it is every bit a Socratic dialogue of science and philosophy and the existential meaning of life—not only Alicia’s life, but all of our lives, collectively. And finally, there is the dark journey to the eschatological. Alicia—and I suppose by extension Cormac McCarthy—doesn’t put much stock in a traditional, Judeo-Christian god, which has to be a myth, of course. At the same time, she has left atheism behind: there has to be something, in her view, even if she cannot identify it. But most terrifying, Alicia is certain that there lies somewhere an undiluted force of evil, something she terms the “Archatron,” that we all resist, even if there is a futility to that resistance.
I consider myself an intelligent and well-informed individual, but reading The Passenger, and especially Stella Maris, was immeasurably humbling. I felt much as I did the first time I read Faulkner’s The Sound and the Fury, and even the second time I read Gould’s Book of Fish, by Richard Flanagan: as if there are minds so much greater than mine that I cannot hope to comprehend all they have to share, yet I can take full pleasure in immersing myself in their work. To borrow a line from Alicia, in her discussion of Oswald Spengler in Stella Maris, we might say also of Cormac McCarthy: “As with the general run of philosophers—if he is one—the most interesting thing was not his ideas but just the way his mind worked.”
Still reeling from the pandemic, the world was rocked to its core on February 24, 2022, when Russian tanks rolled into Ukraine, an act of unprovoked aggression not seen in Europe since World War II that conjured up distressing historical parallels. If there were voices that previously denied the echo of Hitler’s Austrian Anschluss in Putin’s annexation of Crimea, or of German adventurism in the Sudetenland in Russian-sponsored separatism in the Donbas, there was no mistaking the similarity to the Nazi invasion of Poland in 1939. But it was Vladimir Putin’s challenge to the very legitimacy of Kyiv’s sovereignty—echoing the Kremlin’s rising chorus of irredentism that declares Ukraine a wayward chunk of the “near abroad” rightly integral to Russia—that compels us to look much further back in history.
Putin’s claim, however dubious, raises a larger question: by what right can any nation claim self-determination? Is Ukraine really just a modern construct, an opportunistic product of the collapse of the USSR that, because it was historically a part of Russia, should be once again? Or, perhaps counter-intuitively, should western Russia instead be incorporated into Ukraine? Or—let’s stretch it a bit further—should much of modern Germany rightly belong to France? Or vice versa? From a contemporary vantage point, these are tantalizing musings that test the notions of shifting boundaries, the formation of nation states, fact-based if sometimes uncomfortable chronicles of history, the clash of ethnicities, and, most critically, actualities on the ground. Naturally, such speculation abruptly shifts from the purely academic to a stark reality at the barrel of a gun, as the history of Europe has grimly demonstrated over centuries past.
To learn more, I turned to the recently updated edition of The Gates of Europe: A History of Ukraine, by historian and Harvard professor Serhii Plokhy, a dense, well-researched deep dive into the past that at once fully establishes Ukraine’s right to exist while expertly placing it into the context of Europe’s past and present. For those like myself largely unacquainted with the layers of complexity and overlapping hegemonies that have long dominated the region, it turns out that there is much to cover. At the same time, the wealth of material that strikes the reader as unfamiliar discouragingly underscores the western European bias of the classroom—which at least partially explains why even those Americans capable of locating Ukraine on a map prior to the invasion knew almost nothing of its history.
Survey courses in my high school covered Charlemagne’s 9th century empire that encompassed much of Europe to the west, including what is today France and Germany, but never mentioned Kievan Rus’—the cultural ancestor of modern Ukraine, Belarus and Russia—which was in the 10th and 11th centuries the largest and by far the most powerful state on the continent, until it fragmented and then fell to Mongol invaders! To its east, the Grand Principality of Moscow, a 13th century Rus’ vassal state of the Mongols, formed the core of the later Russian Empire. In the 16th and 17th centuries, the Polish–Lithuanian Commonwealth was in its heyday among the largest and most populous states on the continent, but both Poland and Lithuania were to fall to partition by Russia, Prussia, and Austria, and effectively ceased to exist for more than a century. Also missing from maps, of course, were Italy and Germany, which did not achieve statehood until the later 19th century. And the many nations of today’s southeastern Europe were then provinces of the Ottoman Empire. That is European history, complicated and nuanced, as history tends to be.
Plokhy’s erudite study restores from obscurity Ukraine’s past and reveals a people who while enduring occupation and a series of partitions never abandoned an aspiration to sovereignty that was not to be realized until the late 20th century. Once a dominant power, Ukraine was to be overrun by the Mongols, preyed upon for slave labor by the Crimean Khanate, and throughout the centuries sliced up into a variety of enclaves ruled by the Golden Horde, the Polish–Lithuanian Commonwealth, the Austrian Empire, the Tsardom of Russia, and finally the Soviet Union.
That long history was written with much blood and suffering inflicted by its various occupiers. In the last hundred years alone, these included Soviet campaigns of terror, ethnic cleansing, and deportations, as well as the catastrophic Great Famine of 1932–33—known as the “Holodomor”—a product of Stalin’s forced collectivization that led to the starvation deaths of nearly four million Ukrainians. Then there was World War II, which claimed another four million lives, including about a million Jews. The immediate postwar period was marked by more tumult and bloodshed. Stability and a somewhat better quality of life emerged under Nikita Khrushchev, who himself had spent many years of residence in Ukraine. It was Khrushchev who transferred title of the Crimea to Ukraine in 1954. The final years under Soviet domination saw the Chernobyl nuclear disaster.
The structure of the USSR was manifested in political units known as Soviet Socialist Republics, which asserted a fictional autonomy subject to central control. Somewhat ironically, as time passed this enabled and strengthened nationalism within each of the respective SSRs. Ukraine (like Belarus) even held its own United Nations seat, although its UN votes were rubber-stamped by Moscow. Still, this further reinforced a sense of statehood, which was realized in the unexpected dissolution of the Soviet Union and Ukraine’s independence in 1991. In the years that followed, as Ukraine aspired to closer ties with the West, that statehood increasingly came under attack by Putin, who spoke in earnest of a “Greater Russia” that by all rights included Ukraine. Election meddling became common, but with the spectacular fall of the Russian-backed president in 2014, Putin annexed Crimea and fomented rebellion that sought to create breakaway “republics” in the Donbas of eastern Ukraine. This only intensified Kyiv’s desire for integration with the European Union and for NATO membership.
A vast country of forest and steppe, marked by fertile plains crisscrossed by rivers, Ukraine has long served as a strategic gateway between east and west, as emphasized in the book’s title. Elements of western, central, and eastern Europe all in some way give definition to Ukrainian life and culture, and as such Ukraine remains inextricably as much a part of the west as the east. While Russia has left a huge imprint upon the nation’s DNA, it hardly informs the entirety of its national character. The Russian language continues to be widely spoken, and at least prior to the invasion many Ukrainians had Russian sympathies—if never a desire for annexation! For Ukrainians, stateless for too long, their own national identity ever remained unquestioned. Rather than threatening that identity, the Russian invasion has only bolstered it.
Today, Ukraine is the second largest European nation, after Russia. It is a fact far too often overlooked by both statesmen and talking heads that Ukraine would also be the world's third largest nuclear power—and would have little to fear from the tanks of its former overlord—had it not given up its stockpile of nukes in a deal brokered by the United States, an important reminder to those who question America's obligation to defend Ukraine.
As this review goes to press, Russia's war—which Putin euphemistically terms a "special military operation"—is going very poorly, and despite energy supply shortages and threats of nuclear brinksmanship, the West stands firmly with Ukraine, which in the course of the conflict has been subjected to horrific war crimes by Russian invaders. However, as the months pass, as both Europe and the United States endure the economic pain of inflation and rising fuel prices, and as the odds increase of right-wing politicians gaining power on both sides of the Atlantic, it remains to be seen whether this alliance will hold steady. As battlefield defeats mount, and men and materiel run short, Putin seems to be running out the clock in anticipation of that outcome. We can only hope it does not come to that.
While I learned a great deal from The Gates of Europe, and its scholarship deserves much acclaim, portions of it can prove a slog for a nonacademic audience. Too much of the author's chronicle reads like a textbook—dense with names and dates and events—and thus lacks the sweep of a grand thematic narrative, the kind that inspires the reader and that the Ukrainian people he treats so richly deserve. At the same time, that does not diminish Plokhy's achievement in turning out what is certainly the authoritative history of Ukraine. With Ukrainians' right to exist under assault once more, this volume serves as a powerful defense—the weapon of history—against any who might challenge Ukraine's sovereignty. If you believe, as I do, that facts must triumph over propaganda and polemic, then I highly recommend that you turn to Plokhy to best refute Putin.
I often suffer pangs of guilt when a volume received through an early reviewer program languishes on the shelf unread for an extended period. Such was the case with the “Advanced Reader’s Edition” of After the Apocalypse: America’s Role in the World Transformed, by Andrew Bacevich, that arrived in August 2021 and sat forsaken for an entire year until it finally fell off the top of my TBR (To-Be-Read) list and onto my lap. While hardly deliberate, my delay was no doubt neglectful. But sometimes neglect can foster unexpected opportunities for evaluation. More on that later.
First, a little about Andrew Bacevich. A West Point graduate and platoon leader in Vietnam in 1970–71, he went on to an army career that spanned twenty-three years, including the Gulf War, retiring with the rank of colonel. (It is said his early retirement was due to being passed over for promotion after taking responsibility for an accidental explosion at a camp he commanded in Kuwait.) He later became an academic, Professor Emeritus of International Relations and History at Boston University, and one-time director of its Center for International Relations (1998–2005). He is now president and co-founder of the bipartisan think tank the Quincy Institute for Responsible Statecraft. Deeply influenced by the theologian and ethicist Reinhold Niebuhr, Bacevich was once tagged as a conservative Catholic historian, but he defies simple categorization, most often serving as an unlikely voice in the wilderness decrying America's "endless wars." He has been a vocal, longtime critic of George W. Bush's doctrine of preventive war, most prominently manifested in the Iraq conflict, which he has rightly termed a "catastrophic failure." He has also denounced the conceit of "American Exceptionalism," and chillingly notes that reliance on an all-volunteer military force translates into the ongoing, almost anonymous sacrifice of our men and women for a nation that largely has no skin in the game. His own son, a young army lieutenant, was killed in Iraq in 2007. I have previously read three other Bacevich works. As I noted in a review of one of these, his resumé lends Bacevich either enormous credibility or an axe to grind, or perhaps both. Still, as a scholar and gifted writer, he tends to be well worth the read.
The “apocalypse” of the title refers to the chaos that engulfed 2020, spawned by the sum total of the “toxic and divisive” Trump presidency, the increasing death toll of the pandemic, an economy in free fall, mass demonstrations by Black Lives Matter proponents seeking long-denied social justice, and rapidly spreading wildfires that dramatically underscored the looming catastrophe of global climate change. [p.1-3] Bacevich takes this armload of calamities as a flashing red signal that the country is not only headed in the wrong direction, but likely off a kind of cliff if we do not immediately take stock and change course. He draws odd parallels with the 1940 collapse of the French army under the Nazi onslaught, which—echoing French historian Marc Bloch—he lays to “utter incompetence” and “a failure of leadership” at the very top. [p.xiv] This then serves as a head-scratching segue into a long-winded polemic on national security and foreign policy that recycles familiar Bacevich themes but offers little in the way of fresh analysis. The trajectory strikes one as especially incongruous given that the specific litany of woes besetting the nation in his opening narrative has—rarely indeed for the United States—almost nothing to do with the military or foreign affairs.
If history were ever to manufacture an example of a failure of leadership, of course, it would be hard-pressed to come up with a better model than Donald Trump, who drowned out the noise of a series of mounting crises with a deafening roar of self-serving, hateful rhetoric directed at enemies real and imaginary, deliberately ignoring the threats of both coronavirus and climate change while stoking racial tensions. Bacevich gives him his due, noting that his “ascent to the White House exposed gaping flaws in the American political system, his manifest contempt for the Constitution and the rule of law placing in jeopardy our democratic traditions.” [p.2] But while he hardly masks his contempt for Trump, Bacevich makes plain that there’s plenty of blame to go around for political elites in both parties, and he takes no prisoners, landing a series of blows on George W. Bush, Barack Obama, Hillary Clinton, Joe Biden, and a host of other members of the Washington establishment whom he holds accountable for fostering and maintaining the global post-Cold War “American Empire” responsible for the “endless wars” he has long condemned. He credits Trump for urging a retreat from alliances and engagements, but faults the selfish motives of an “America First” predicated on isolationism. Bacevich instead envisions a more positive role for the United States in the international arena—one with its sword permanently sheathed.
All this is heady stuff, and regardless of their politics, many readers will find themselves nodding along as Bacevich makes his case, outlining the many wrongheaded policy endeavors championed by Republicans and Democrats alike for a wobbly superpower clinging to an outdated and increasingly irrelevant sense of national identity that fails to align with the global realities of the twenty-first century. But then, as Bacevich looks to the future for alternatives, as he seeks to map out on paper the next new world order, he stumbles, and stumbles badly, something only truly evident in retrospect when viewing his arguments through the prism of the events that followed the release of After the Apocalypse in June 2021.
Bacevich has little to add here to his longstanding condemnation of the U.S. occupation of Afghanistan, which after two long decades of failed attempts at nation-building came to an end with our messy withdrawal in August 2021, just shortly after this book’s publication. President Biden was pilloried for the chaotic retreat, but while his administration could rightly be held to account for a failure to prepare for the worst, the elephant in that room in the Kabul airport where the ISIS-K suicide bomber blew himself up was certainly former president Trump, who brokered the deal to return Afghanistan to Taliban control. Biden, who plummeted in the polls due to outcomes he could do little to control, was disparaged much the same way Obama once was when he was held to blame for the subsequent turmoil in Iraq after effecting the withdrawal of U.S. forces agreed to by his predecessor, G.W. Bush. Once again, history rhymes. But the more salient point for those of us who share, as I do, Bacevich’s anti-imperialism, is that getting out is ever more difficult than going in.
But Bacevich has a great deal to say in After the Apocalypse about NATO, an alliance rooted in a long-past Cold War stand-off that he pronounces counterproductive and obsolete. Bacevich disputes the long-held mythology of the so-called “West,” an artificial “sentiment” that has the United States and European nations bound together by common values of liberty, human rights, and democracy. Like Trump—who likely would have acted upon this had he been reelected—Bacevich calls for an end to US involvement with NATO. The United States and Europe have embarked on “divergent paths,” he argues, and that is as it should be. The Cold War is over. Relations with Russia and China are frosty, but entanglement in an alliance like NATO only fosters acrimony and fails to adapt our nation to the realities of the new millennium.
It is an interesting if academic argument that was abruptly crushed under the weight of the treads of Russian tanks in the premeditated invasion of Ukraine on February 24, 2022. If some denied the echo of Hitler’s 1938 Austrian Anschluss in Putin’s 2014 annexation of Crimea, there was no mistaking the similarity of the unprovoked attacks on Kyiv and its sister cities to the Nazi war machine’s march on Poland in 1939. And yes, when Biden and French President Emmanuel Macron stood together to unite that so-called West against Russian belligerence, the memory of France’s 1940 defeat was hardly out of mind. All of a sudden, NATO became less a theoretical construct and somewhat more of a safe haven against brutal militarism, wanton aggression, and unapologetic war crimes livestreamed on twenty-first-century social media: streets littered with the bodies of civilians, many of them children. All of a sudden, NATO is pretty goddamned relevant.
In all this, you could rightly point to the wrong turns made after the dissolution of the USSR, to the failure of the West to allocate appropriate economic support for the heirs of the former Soviet Union, to the way a pattern of NATO expansion both isolated and antagonized Russia. But there remains no legitimate defense for Putin’s attempt to invade, besiege, and absorb a weaker neighbor—or at least a neighbor he perceived to be weaker, a misstep that could lead to his own undoing. Either way, the institution we call NATO turned out to be something to celebrate rather than deprecate. The fact that it is working exactly the way it was designed to work could turn out to be the real road map to the new world order that emerges in the aftermath of this crisis. We can only imagine the horrific alternatives had Trump won re-election: the U.S. out of NATO, Europe divided, Ukraine overrun and annexed, and perhaps even Putin feted at a White House dinner. So far, without firing a shot, NATO has not only saved Ukraine; arguably, it has saved the world as we know it, a world that extends well beyond whatever we might want to consider the “West.”
As much as I respect Bacevich and admire his scholarship, his informed appraisal of our current foreign policy realities has turned out to be entirely incorrect. Yes, the United States should rein in the American Empire. Yes, we should turn away from imperialist tendencies. Yes, we should focus our defense budget solely on defense, not aggression, resisting the urge to try to remake the world in our own image for either altruism or advantage. But at the same time, we must be mindful—like other empires in the past—that retreat can create vacuums, and we must be ever vigilant of what kinds of powers may fill those vacuums. Because we can grow and evolve into a better nation, a better people, but that evolution may not be contagious to our adversaries. Because getting out remains ever more difficult than going in.
Finally, a word about the use of the term “apocalypse,” a characterization that is bandied about a bit too frequently these days. 2020 was a pretty bad year, indeed, but it was hardly apocalyptic. Not even close. Despite the twin horrors of Trump and the pandemic, we have had other years that were far worse. Think 1814, when the British burned Washington and sent the president fleeing for his life. And 1862, with tens of thousands already lying dead on Civil War battlefields as the Union army suffered a series of reverses. And 1942, still in the throes of economic depression, with Germany and Japan lined up against us. And 1968, marked by riots and assassinations, when it truly seemed that the nation was unraveling from within. Going forward, climate change may certainly breed apocalypse. So might a cornered Putin, equipped with an arsenal of nuclear weapons and diminishing options as Russian forces in the field teeter on collapse. But 2020 is already in the rear-view mirror. It will no doubt leave a mark upon us, but as we move on, it spins ever faster into our past. At the same time, predicting the future, even when armed with the best data, is fraught with unanticipated obstacles, and grand strategies almost always lead to failure. It remains our duty to study our history while we engage with our present. Apocalyptic or not, it’s all we’ve got …
On October 4, 1957, the Soviet Union sent geopolitical shock waves across the planet with the launch of Sputnik 1, the first artificial Earth satellite. Sputnik was only twenty-three inches in diameter, transmitted radio signals for a mere twenty-one days, then burned up on reentry just three months after first achieving orbit, but it changed everything. Not only were the dynamics of the Cold War permanently altered by what came to be dubbed the “Space Race,” but the success of Sputnik ushered in a dramatic new era for developments in science and technology. I was not quite six months old.
America would later win that race to the moon, but despite its fearsome specter as a diabolical power bent on world domination, the USSR turned out to be a kind of vast Potemkin Village that almost noiselessly went out of business at the close of 1991. The United States had pretty much lost interest in space travel by then, but that was just about the time that the next critical phase in the emerging digital age—widespread public access to personal computers and the internet—first wrought the enormous changes upon the landscape of American life that today might have Gen Z “zoomers” considering 1957 as something like a date out of ancient times.
And now, as this review goes to press—in yet one more recycling of the bon mot attributed to Mark Twain, “History doesn’t repeat itself, but it often rhymes”—NASA has temporarily scrubbed the much-anticipated blastoff of the lunar-bound Artemis I, but a real space race is again fiercely underway, although this time the rivals include not only Russia but China and a whole host of billionaires, at least one of whom could potentially fit the template for a “James Bond” style villain. And while all this is going on, I recently registered for Medicare.
Sixty-five years later, there’s a lot to look back on. In 1957: The Year That Launched the American Future (2020), a fascinating, fast-paced chronicle composed of articulately rendered, thought-provoking chapter-length essays, author and journalist Eric Burns reminds us what a pivotal year that proved to be, not only in kindling that first contest to dominate space but in multiple other social, political, and cultural arenas, much of it apparent only in retrospect.
That year, while Sputnik stoked alarms that nuclear-armed Russians would annihilate the United States with bombs dropped from outer space, tabloid journalism reached then-new levels of the outrageous in exploiting “The Mad Bomber of New York,” who turned out to be a pathetic little fellow whose series of explosives claimed not a single fatality. In another example of history’s unintended consequences, a congressional committee investigating illegal labor activities helped facilitate Jimmy Hoffa’s takeover of the Teamsters. A cloak of mystery was partially lifted from organized crime with a very public police raid at Apalachin that rounded up Mafia bosses by the score. The iconic ’57 Chevy ruled the road and cruised on newly constructed interstate highways that would revolutionize travel as well as wreak havoc on cityscapes. African Americans remained second-class citizens, but struggles for equality ignited a series of flashpoints. In September 1957, President Eisenhower federalized the Arkansas National Guard and sent Army troops to Little Rock to enforce desegregation. That same month, Congress passed the Civil Rights Act of 1957, watered-down yet still landmark legislation that paved the way for more substantial action ahead. Published that year were Jack Kerouac’s On the Road and Nevil Shute’s On the Beach. Michael Landon starred in I Was a Teenage Werewolf. Little Richard, who claimed to see Sputnik while performing in concert and took it as a message from God, abruptly walked off stage and abandoned rock music to preach the word of the Lord. But the nation’s number one hit was Elvis Presley’s All Shook Up; rock n’ roll was here to stay.
Burns’ commentary on all this and more is engaging and generally a delight to read, but 1957 is by no means a comprehensive history of that year. In fact, it is a stretch to term this book a history at all, except in the sense that the events it describes occurred in the past. It is rather a subjective collection of somewhat loosely linked commentaries that spotlight specific events and emerging trends the author identifies as formative for the nation we would become in the decades that followed. As such, the book succeeds due to Burns’ keen sense of how both key episodes and more subtle cultural waves influenced a country in transition from the conventional, consensus-driven postwar years to the radicalized, tumultuous times that lay just ahead.
His insight is most apparent in his cogent analysis of how Civil Rights advanced not only through lunch-counter sit-ins and a reaction marked by violent repression, but through cultural shifts among white Americans—and how rock n’ roll played at least some role in this evolution of outlooks. At the same time, his conservative roots are exposed in his treatment of On the Road and the rise of the “Beat generation”; Burns genuinely seems as baffled by their emergence as he is amazed that anyone could praise Kerouac’s literary talents. But, to his credit, he recognizes the impact the novel had upon a national audience that could no longer confidently boast of a certainty in its destiny. And it is Burns’ talent with a pen that captivates a markedly different audience, some sixty-five years later.
In the end, the author leaves us yearning for more. After all, other than references that border on the parenthetical to Richard Nixon, Robert F. Kennedy, and Dag Hammarskjöld, there is almost no discussion of national politics or international relations, essential elements in any study of a nation at what the author insists is a critical juncture. Even more problematic, conspicuous in its absence is the missing chapter that should have been devoted to television. In 1950, 3.9 million TV sets sat in fewer than ten percent of American homes. By 1957, that number had increased roughly tenfold, to 38.9 million TVs in the homes of nearly eighty percent of the population! That year, I Love Lucy aired its final half-hour episode, but in addition to network news, families were glued to their black-and-white consoles watching Gunsmoke, Alfred Hitchcock, Lassie, You Bet Your Life, and Red Skelton. For the World War II generation, technology that brought motion pictures into their living rooms was something like miraculous. Nothing was more central to the identity of the average American in 1957 than television, but Burns inexplicably ignores it.
Other than Sputnik, which clearly marked a turning point for science and exploration, it is a matter of some debate whether 1957 should be singled out as the start of a new era. One could perhaps argue instead for the election of John F. Kennedy in 1960, or with even greater conviction, for the date of his assassination in 1963, as a true crossroads of sorts for the past and future United States. Still, if for no other reason than the conceit that this was my birth year, I am willing to embrace Burns’ thesis that 1957 represented a collective critical moment for us all. Either way, his book delivers an impressive tour of a time that seems ever more distant with the passing of each and every day.
Early in 2022, I saw Casablanca on the big screen for the first time, the 80th anniversary of its premiere. Although over the years I have watched it in excess of two dozen times, this was a stunning, even mesmerizing experience for me, not least because I consider Casablanca the finest film of Old Hollywood—this over the objections of some of my film-geek friends who would lobby for Citizen Kane in its stead. Even so, most would concur with me that its star, Humphrey Bogart, was indeed the greatest actor of that era.
Attendance was sparse, diminished by a resurgence of COVID, but I sat transfixed in that nearly empty theater as Bogie’s distraught, drunken Rick Blaine famously raged that “Of all the gin joints in all the towns in all the world, she walks into mine!” He is, of course, lamenting his earlier unexpected encounter with old flame Ilsa Lund, splendidly portrayed with a sadness indelibly etched upon her beautiful countenance by Ingrid Bergman, who with Bogart led the credits of a magnificent ensemble cast that also included Paul Henreid, Claude Rains, Conrad Veidt, Sydney Greenstreet, and Peter Lorre. But Bogie remains the central object of that universe; the plot and the players orbit about him. There’s no doubt that without Bogart, there could never have been a Casablanca as we know it. Such a movie might have been made, but it could hardly have achieved a greatness on this order of magnitude.
Bogie never actually uttered the signature line “Play it again, Sam,” so closely identified with the production (and later whimsically poached by Woody Allen for the title of his iconic 1972 comedy peppered with clips from Casablanca). And although the film won Academy Awards for Best Picture and Best Director, as well as in almost every other major category, Bogart was nominated but missed out on the Oscar, which instead went to Paul Lukas—does anyone still remember Paul Lukas?—for his role in Watch on the Rhine. This turns out to be a familiar story for Bogart, who struggled with a lifelong frustration at typecasting, miscasting, studio manipulation, lousy roles, inadequate compensation, missed opportunities, and repeated snubs—public recognition of his talent and star quality came only late in life and even then frequently eluded him, as on that Oscar night. He didn’t really expect to win, but we can yet only wonder at what Bogart must have been thinking . . . He was already forty-four years old on that disappointing evening when the Academy passed him over. There was no way he could have known that most of his greatest performances lay ahead, that after multiple failed marriages (one still unraveling that very night) a young starlet he had only just met would come to be the love of his life and mother of his children, and that he would at last achieve not only the rare brand of stardom reserved for just a tiny slice of the top tier in his profession, but that he would go on to become a legend in his own lifetime and well beyond it: the epitome of the cool, tough, cynical guy who wears a thin veneer of apathy over an incorruptible moral center, women swooning over him as he stares down villains, an unlikely hero that every real man would seek to emulate.
My appreciation of Casablanca and its star in this grand cinema setting was enhanced by the fact that I was at the time reading Bogart (1997), by A.M. Sperber & Eric Lax, which is certainly the definitive biography of his life. I was also engaged in a self-appointed effort to watch as many key Bogie films in roughly chronological order as I could while reading the bio, which eventually turned out to be a total of twenty movies, from his first big break in The Petrified Forest (1936) to The Harder They Fall (1956), his final role prior to his tragic, untimely death at fifty-seven from esophageal cancer.
Bogie’s story is told brilliantly in this unusual collaboration by two authors who had never actually met. Ann Sperber, who wrote a celebrated biography of journalist Edward R. Murrow, spent seven years researching Bogart’s life and conducted nearly two hundred interviews with those who knew him most intimately before her sudden death in 1994. Biographer Eric Lax stepped in and shaped her draft manuscript into a coherent finished product that reads seamlessly like a single voice. I frequently read biographies of American presidents not only to study the figure that is profiled, but because the very best ones serve double duty as chronicles of United States history, the respective president as the focal point. I looked to the Bogart book for something similar, in this case a study of Old Hollywood with Bogie in the starring role. I was not to be disappointed.
Humphrey DeForest Bogart was born on Christmas Day 1899 in New York City to wealth and privilege, with a father who was a cardiopulmonary surgeon and a mother who was a commercial illustrator. Both parents were distant and unaffectionate. They had an apartment on the Upper West Side and a vast estate on Canandaigua Lake in upstate New York, where Bogie began his lifelong love affair with boating. Indifferent to higher education, he eventually flunked out of boarding school and joined the navy. There seems to have been nothing noteworthy about his early life.
His acting career began almost accidentally, and he spent several years on the stage before making his first full-length feature in 1930, Up the River, with his drinking buddy Spencer Tracy, who called him “Bogie.” He was already thirty years old. What followed were largely lackluster roles on both coasts, alternating between Broadway theaters and Hollywood studios. He was frequently broke, drank heavily, and his second marriage was crumbling. Then he won rave reviews as escaped murderer Duke Mantee in The Petrified Forest, playing opposite Leslie Howard on the stage. The studio bought the rights, but characteristically for Bogie, they did not want to cast him to reprise his role, looking instead for an established actor, with Edward G. Robinson at the top of the list. Then Howard, who had production rights, stepped in to demand Bogart get the part. The 1936 film adaptation of the play, which also featured a young Bette Davis, channeled Bogart’s dark and chillingly realistic portrayal of a psychopathic killer—in an era when gangsters like Dillinger and Pretty Boy Floyd dominated the headlines—and made Bogie a star.
But again he faced a series of let-downs. This was the era of the studio system, with actors used and abused by big shots like Jack Warner, who locked Bogart into a low-paid contract that tightly controlled his professional life, casting him repeatedly in virtually interchangeable gangster roles in a string of B-movies. It wasn’t until 1941, when he played Sam Spade in The Maltese Falcon—quintessential film noir as well as John Huston’s directorial debut—that Bogie joined the ranks of undisputed A-list stars and began the process of taking revenge on the studio system by commanding greater compensation and demanding greater control of his screen destiny. But in those days, despite his celebrity, that remained an uphill battle.
I began watching his films while reading the bio as a lark, but it turned out to be an essential assignment: you can’t read about Bogie without watching him. Many of the twenty that I screened I had seen before, some multiple times, but others were new to me. I was raised by my grandparents in the 1960s with a little help from a console TV in the living room and all of seven channels delivered via rooftop antenna. When cartoons, soaps, and prime time westerns and sitcoms weren’t broadcasting, the remaining airtime was devoted to movies. All kinds of movies, from the dreadful to the superlative and everything in-between, often on repeat. Much of it was classic Hollywood, and Bogart made the rounds. One of my grandfather’s favorite flicks was The Treasure of the Sierra Madre, and I can recall as a boy watching it with him multiple times. In general, he was a lousy parent, but I am grateful for that gift; it remains among my top Bogie films. We tend most often to think of Bogart as Rick Blaine or Philip Marlowe, but it is as Fred C. Dobbs in The Treasure of the Sierra Madre and Charlie Allnutt in The African Queen and Captain Queeg in The Caine Mutiny that the full range of his talent is revealed.
It was hardly his finest role or his finest film, but it was while starring as Harry Morgan in To Have and Have Not (1944) that Bogie met and fell for his co-star, the gorgeous, statuesque, nineteen-year-old Lauren Bacall—twenty-five years younger than him—spawning one of Hollywood’s greatest on-screen, off-screen romances. They would be soulmates for the remainder of his life, and it was she who brought out the very best of him. Despite his tough guy screen persona, the real-life Bogie tended to be a brooding intellectual who played chess, was well-read, and had a deeply analytical mind. An expert sailor, he preferred boating on the open sea to carousing in bars, although he managed to do plenty of both. During crackdowns on alleged communist influence in Hollywood, Bogart and Bacall together took controversial and sometimes courageous stands against emerging blacklists and the House Un-American Activities Committee (HUAC). But he also had his flaws. He could be cheap. He could be a mean drunk. He sometimes wore a chip on his shoulder carved out of years of frustration at what was after all a very slow rise to the top of his profession. But warts and all, far more of his peers loved him than not.
Bogart is a massive tome, and the first section is rather slow-going because Bogie’s early life was just so unremarkable. But it holds the reader’s interest because it is extremely well-written, and it goes on to succeed masterfully in spotlighting Bogart’s life against the rich fabric that forms the backdrop of that distant era of Old Hollywood before the curtains fell for all time. If you are curious about either, I highly recommend this book. If you are too busy for that, at the very least carve out some hours of screen time and watch Bogie’s films. You will not regret the time spent. Although his name never gets dropped in the lyrics by Ray Davies for the familiar Kinks tune, if there were indeed Celluloid Heroes, the greatest among them was certainly Humphrey Bogart.
NOTE: These are Bogart films I screened while reading this book:
Historians consistently rank him at the top, tied with Washington for first place or simply declared America’s greatest president. His tenure was almost precisely synchronous with the nation’s most critical existential threat: his very election sparked secession, first shots were fired at Sumter a month after his inauguration, the cannon stilled at Appomattox a week before his murder. There were still armies in the field, but he was gone, replaced by one of the most sinister men ever to take the oath of office, leaving generations of his countrymen to wonder what might have transpired with all the nation’s painful unfinished business had he survived, from the trampled hopes for equality for African Americans to the promise of a truly “New South” that never emerged. A full century ago, decades after his death, he was reimagined as an enormous, seated marble man with the soulful gaze of fixed purpose, the central icon in his monument that provokes tears from so many visitors who stand in awe before him. When people think of Abraham Lincoln, that’s the image that usually springs to mind.
The seated figure rises to a height of nineteen feet; somebody calculated that if it stood up it would be some twenty-eight feet tall. The Lincoln that once walked the earth was not nearly that gargantuan, but he was nevertheless a giant in his time: physically, intellectually—and far too frequently overlooked—politically! He sometimes defies characterization because he was such a character, in so very many ways.
An autodidact gifted with a brilliant analytical mind, he was also a creature of great integrity loyal to a firm sense of a moral center that ever evolved when polished by new experiences and touched by unfamiliar ideas. A savvy politician, he understood how the world worked. He had unshakeable convictions, but he was tolerant of competing views. He had a pronounced sense of empathy for others, even and most especially his enemies. In company, he was a raconteur with a great sense of humor given to anecdotes often laced with self-deprecatory wit. (Lincoln, thought to be homely, when accused in debate of being two-faced, self-mockingly replied: “I leave it to my audience. If I had another face, do you think I’d wear this one?”) But despite his many admirable qualities, he was hardly flawless. He suffered from self-doubt, struggled with depression, stumbled through missteps, burned with ambition, and harbored a mean streak that, though generally suppressed, could loom large. More than anything else he had an outsize personality.
And Lincoln likewise left an outsize record of his life and times! So why has he generally posed such a challenge for biographers? Remarkably, some 15,000 books have been written about him—second, it is said, only to Jesus Christ—and yet in this vast literature, the essence of Lincoln again and again somehow seems out of reach to his chroniclers. We know what he did and how he did it all too well, but portraying what the living Lincoln must have been like has remained frustratingly elusive in all too many narratives. For instance, David Herbert Donald’s highly acclaimed bio—considered by many the best single volume treatment of his life—is indeed impressive scholarship, yet leaves us with a Lincoln who is curiously dull and lifeless. Known for his uproarious banter, the guy who joked about being ugly for political advantage is glaringly absent in most works outside of Gore Vidal’s Lincoln, which superbly captures him but remains, alas, a novel not a history.
All that changed with A Self-Made Man: The Political Life of Abraham Lincoln Vol. I, 1809–1849, by Sidney Blumenthal (2016), an epic, ambitious, magnificent contribution to the historiography that demonstrates not only that despite the thousands of pages written about him there still remains much to say about the man and his times, but even more significantly that it is possible to brilliantly recreate for readers what it must have been like to engage with the flesh and blood Lincoln. This is the first in a projected four-volume study (two subsequent volumes have been published to date) that—as the subtitle underscores—emphasizes the “political life” of Lincoln, another welcome contribution to a rapidly expanding genre focused upon politics and power, as showcased in such works as Jon Meacham’s Thomas Jefferson: The Art of Power, Robert Dallek’s Franklin D. Roosevelt: A Political Life, and George Washington: The Political Rise of America’s Founding Father, by David O. Stewart.
At first glance, this tactic might strike one as surprising, since prior to his election as president in 1860 Lincoln could boast of little in the realm of public office beyond service in the Illinois state legislature and a single term in the US House of Representatives in the late 1840s. But, as Blumenthal’s deeply researched and well-written account reveals, politics defined Lincoln to his very core, inextricably manifested in his life and character from his youth onward, something too often disregarded by biographers of his early days. It turns out that Lincoln was every bit a political animal, and there is a trace of that in nearly every job he ever took, every personal relationship he ever formed, and every goal he ever chased.
This approach triggers a surprising epiphany for the student of Lincoln. It is as if an entirely new dimension of the man has been exposed for the first time that lends new meaning to words and actions previously treated superficially or—worse—misunderstood by other biographers. Early on, Blumenthal argues that Donald and others have frequently been misled by Lincoln’s politically crafted utterances that cast him as marked by passivity, too often taking him at his word when a careful eye on the circumstances demonstrates the exact opposite. In contrast, Lincoln, ever maneuvering, if quietly, could hardly be branded as passive [p9]. Given this perspective, the life and times of young Abe is transformed into something far richer and more colorful than the usual accounts of his law practice and domestic pursuits. In another context, I once snarkily exclaimed “God save us from The Prairie Years” because I found Lincoln’s formative period—and not just Sandburg’s version of it—so uninteresting and unrelated to his later rise. Blumenthal has proved me wrong, and that sentiment deeply misplaced.
But Blumenthal not only succeeds in fleshing out a far more nuanced portrait of Lincoln—an impressive accomplishment on its own—but in the process boldly sets out to do nothing less than scrupulously detail the political history of the United States in the antebellum years from the Jackson-Calhoun nullification crisis onward. Ambitious is hardly an adequate descriptor for the elaborate narrative that results, a product of both prodigious research and a very talented pen. Scores of pages—indeed whole chapters—pass with literally no mention of Lincoln at all, a striking technique that is surprisingly successful; while Lincoln may appear conspicuous in his absence, he is nevertheless present, like the reader a studious observer of these tumultuous times even when he is not directly engaged, only making an appearance when the appropriate moment beckons. As such, A Self-Made Man is every bit as much a book of history as it is biography, a key element of the author’s unstated thesis: that it is impossible to truly get to know Lincoln—especially the political Lincoln—except in the context and complexity of his times, a critical emphasis not afforded in other studies.
And there is much to chronicle in these times. Some of this material is well known, even if until recently subject to faulty analysis. The conventional view of the widespread division that characterized the antebellum period centered on a sometimes-paranoid south on the defensive, jealous of its privileges, in fear of a north encroaching upon its rights. But in keeping with the latest historiography, Blumenthal deftly highlights how it was that, in contrast, the slave south—which already wielded a disproportionate share of national political power due to the Constitution’s three-fifths clause that inflated its representation—not only stifled debate on slavery but aggressively lobbied for its expansion. And just as a distinctly southern political ideology evolved its notion of the peculiar institution from the “wolf by the ear” necessary evil of Jefferson’s time to a vaunted hallmark of civilization that boasted benefit to master and servant, so too did it come to view the threat of separation less in dread than anticipation. The roots of all that an older Lincoln would witness severing the ancient “bonds of affection” of the then no longer united states were planted in these, his early years.
Other material is less familiar. Who knew how integral to Illinois politics—for a time—was the cunning Joseph Smith and his Mormon sect? Or that Smith’s path was once entangled with the budding career of Stephen A. Douglas? Meanwhile, the author sheds new light on the long rivalry between Lincoln and Douglas, which had deep roots that went back to the 1830s, decades before their celebrated clash on the national stage brought Lincoln to a prominence that finally eclipsed Douglas’s star.
Blumenthal’s insight also adeptly connects the present to the past, affording a greater relevance for today’s reader. He suggests that the causes of the financial crisis of 2008 were not all that dissimilar to those that drove the Panic of 1837, but rather than mortgage-backed securities and a housing bubble, it was the monetization of human beings as slave property that leveraged enormous fortunes that vanished overnight when an oversupply of cotton sent market prices plummeting, which triggered British banks to call in loans on American debtors—a cotton bubble that burst spectacularly [p158-59]. This point can hardly be overstated, since slavery was not only integral to the south’s economy, but by the eve of secession human property represented the largest single form of wealth in the nation, exceeding the combined value of all American railroads, banks, and factories. A cruel system that assigned values to men, women, and children like cattle had deep ramifications not only for masters who acted as “breeders” in the Chesapeake and markets in the deep south, but also for insurance companies in Hartford, textile mills in Lowell, and banks in London.
Although Blumenthal does not himself make this point, I could detect eerie if imperfect parallels to the elections of 2016 and 1844, with Lincoln seething as the perfect somehow became the enemy of the good. In that contest, Whig Henry Clay was up against Democrat James K. Polk. Both were slaveowners, but Clay opposed the expansion of slavery while Polk championed it. Antislavery purists in New York rejected Clay for the tiny Liberty Party, which by a slender margin tipped the election to Polk, who then boosted the slave power with Texas annexation, and served as principal author of the Mexican War that added vast territories to the nation, setting forces in motion that later spawned secession and Civil War. Lincoln was often prescient, but of course he could not know all that was to follow when, a year after Clay’s defeat, he bitterly denounced the “moral absolutism” that led to the “unintended tragic consequences” of Polk’s elevation to the White House [p303]. To my mind, there was an echo of this in the 2016 disaster that saw Donald Trump prevail, a victory at least partially driven by those unwilling to support Hillary Clinton who—despite the stakes—threw away their votes on Jill Stein and Gary Johnson.
No review could properly summarize the wealth of the material contained here, nor overstate the quality of the presentation, which also suggests much promise for the volumes that follow. I must admit that at the outset I was reluctant to read yet another book about Lincoln, but A Self-Made Man was recommended to me by no less a figure than historian Rick Perlstein (author of Nixonland), and like Perlstein’s, Blumenthal’s style is distinguished by animated prose bundled with a kind of uncontained energy that frequently delivers paragraphs given to an almost breathless exhale of ideas and people and events that expertly locates the reader at the very center of concepts and consequences. The result is something exceedingly rare for books of history or biography: a page-turner! Whether new to studies of Lincoln or a long-time devotee, this book should be required reading.
NOTE: A review of one of Rick Perlstein’s books is here:
As the COVID-19 pandemic swept the globe in 2020, it left in its wake the near-paralysis of many hospital systems, unprepared and unequipped for the waves of illness and death that suddenly overwhelmed capacities for treatment that was, after all, at best palliative, since for this deadly new virus there was neither a cure nor a clear route to prevention. Overnight, epidemiologists—scrambling for answers or even just clues—became the most critically significant members of the public health community, even if their informed voices were often shouted down by the shriller ones of media pundits and political hacks.
Meanwhile, data collection began in earnest and the number of data dashboards swelled. In the analytical process, the first stop was identifying the quality of the data and the disparities in how data was collected. Was it true, as some suggested, that a disproportionate number of African Americans were dying from COVID? At first, there was no way to know since some states were not collecting data broken down by this kind of specific demographic. Data collection eventually became more standardized, more precise, and more reliable, serving as a key ingredient to combat the spread of this highly contagious virus, as well as one of the elements that guided the development of vaccines. Even so, dubious data and questionable studies too often took center stage both at political rallies and in the media circus that echoed a growing polarization that had one side denouncing masks, resisting vaccination, and touting sideshow magic bullets like Ivermectin. But talking heads and captive audiences aside, masks reduce infection, vaccines are effective, and dosing with Ivermectin is a scam. How do we know that? Data, mostly. Certainly, other key parts of the mix include scientists, medical professionals, case studies, and peer reviewed papers, but data—first collected and then analyzed—is the gold standard, not only for COVID but for all disease treatment and prevention.
But it wasn’t always that way.
In the beginning, there was no such thing as epidemiology. Disease causes and treatments were anecdotal, mystical, or speculative. Much of the progress in science and medicine that was the legacy of the classical world had long been lost to the west. The dawn of modern epidemiology rose above a horizon constructed of data painstakingly collected and compiled and subsequently analyzed. In fact, certain aspects of the origins of epidemiology were to run concurrent with the evolution of statistical analysis. In the early days, as the reader comes to learn in this brilliant and groundbreaking 2021 work by historian Jim Downs, Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine, the bulk of the initial data was derived from unlikely and unwilling participants who existed at the very margins: the enslaved, the imprisoned, the war-wounded, and the destitute condemned to the squalor of public hospitals. Their identities are mostly forgotten, or were never recorded in the first place, yet collectively the data harvested from them was to provide the skeletal framework for the foundation of modern medicine.
In a remarkable achievement that could hardly be more relevant today, the author cleverly locates Maladies of Empire at the intersection of history and medicine, where data collection from unexpected and all too frequently wretched subjects comes to form the very basis of epidemiology itself. It is these early stories that send shudders through a modern audience. Nearly everyone is familiar with the wrenching 1787 diagram of the lower deck of the slave ship Brookes, where more than four hundred fifty enslaved human beings were packed like sardines for a months-long voyage, which became an emblem for the British antislavery movement. But, as Downs points out, few are aware that the sketch can be traced to the work of British naval surgeon Dr. Thomas Trotter, one of the first to recognize that poor ventilation in crowded conditions results in a lack of oxygen that breeds disease and death. His observations also led to a better understanding of how to prevent scurvy, a frequent cause of higher mortality rates among the seaborne citrus-deprived. Trotter himself was appalled by the conditions he encountered on the Brookes, and testified to this before the House of Commons. But that was hardly the case for many of his peers, and certainly not for the owners of slave ships, who looked past the moral dilemmas of a Trotter while exceedingly grateful for his insights; after all, the goal was to keep more of their human cargo alive in order to turn greater profits. Dead slaves lack market value.
A little more than three decades prior to Trotter’s testimony, the critical need for ventilation was documented by another physician in the wake of the confinement of British soldiers in the infamous “Black Hole of Calcutta” during the revolution in Bengal, which resulted in the death by suffocation of the majority of the captives. Downs makes the point that one of the unintended consequences of colonialism was that for early actors in the medical arena it served to vastly extend the theater of observation of the disease-afflicted to a virtually global stage that hosted the byproducts of colonialism: war, subjugated peoples, the slave trade, military hospitals and prisons. But it turns out that the starring roles belong less to the doctors and nurses that receive top billing in the history books than to the mostly uncredited bit players removed from the spotlight: the largely helpless and disadvantaged patients whose symptoms and outcomes were observed and cataloged, whose anonymous suffering translated into critical data that collectively advanced the emerging science of epidemiology.
Traditionally, history texts rarely showcased notable women, but one prominent exception was Florence Nightingale, frequently extolled for her role as a nurse during the Crimean War. But as underscored in Maladies of Empire, Nightingale’s real if often overlooked legacy was as a kind of disease statistician through her painstaking data collection and analysis—the very basis for epidemiology that was generally credited to white men rather than to “women working in makeshift hospitals.” [p111] But it was the poor outcomes for patients typically subjected to deplorable conditions in these makeshift military hospitals—which Nightingale assiduously observed and recorded—that drew attention to similarly appalling environments in civilian hospitals in England and the United States, which led to a studied analysis that eventually established systematic evidence for the causes, spread, and treatment of disease.
The conclusions these early epidemiologists reached were not always accurate. In fact, they were frequently wrong. But Downs emphasizes that what was significant was the development of the proper analytical framework. In these days prior to the revolutionary development of germ theory, notions on how to improve survival rates of the stricken put forward by Nightingale and others were controversial and often contradictory. Was the best course quarantine, a frequent resort? Or would improving the sickbed conditions, as Nightingale advocated, lead to better outcomes? With the role of germs in contagion unknown, the evidence could be both inconclusive and inconsistent, and competing ideas could each be partly right. After all, regardless of how disease spread, cleaner and better ventilated facilities might lead to lower mortality rates. Nightingale stubbornly resisted germ theory, even as it was widely adopted, but after it won her grudging acceptance, she continued to promote more sanitary hospital conditions to improve survival rates. Still, epidemiologists faced difficult challenges with diseases that did not conform to familiar patterns, such as cholera, spread by a tainted water supply, and yellow fever, a mosquito-borne pathogen.
In the early days, as noted, European observers collected data from slave ships, and it never occurred to them that evidence drawn from black subjects might not be applicable to the white population. But epidemiology took a surprisingly different course in the United States, where race has long proved to be a defining element. Of the more than six hundred thousand who lost their lives during the American Civil War, about two-thirds were felled not by bullets but by disease. The United States Sanitary Commission (USSC) was established in an attempt to ameliorate these dreadful outcomes, but its achievements on one hand were undermined on the other by an obsession with race, even going so far as sending out to “. . . military doctors a questionnaire, ‘The Physiological Status of the Negro,’ whose questions were based on the belief that Black soldiers were innately different from white soldiers . . . The questionnaire also distinguished gradations of color among Black soldiers, asking doctors to compare how ‘pure Negroes’ differed from people of ‘mixed races’ and to describe ‘the effects of amalgamation on the vital endurance and vigor of the offspring.’” With its imprimatur of governmental authority, the USSC officially championed scientific racism, with profound and long-term social, political, and economic consequences for African Americans. [p134-35]
Some of these notions can be traced back to the antebellum musings of Alabama surgeon Josiah Nott—made famous after the war when he correctly connected mosquitoes to the etiology of Yellow Fever—who asserted that blacks and whites were members of separate species whose mixed-race offspring he deemed “hybrids” who were “physiologically inferior.” Nott believed that all three of these distinct “types” responded differently to disease. [p124-25] His was but one manifestation of the once widespread pseudoscience of physiognomy that alleged black inferiority in order to justify first slavery and later second-class citizenship. Such ideas persisted for far too long, and although scientific racism still endures on the alt-right, it has been thoroughly discredited by actual scientists. It turns out that a larger percentage of African Americans did indeed succumb to death in the still ongoing COVID pandemic, but this has been shown to be due to factors of socioeconomic status and lack of access to healthcare, not genetics.
Still, although deemed inferior, enslaved blacks also proved useful when convenient. The author argues that “… enslaved children were most likely used as the primary source of [smallpox] vaccine matter in the Civil War South,” despite the danger of infection in harvesting lymph from human subjects in order to vaccinate Confederate soldiers in the field. In yet one more reminder of the moral turpitude that defined the south’s “peculiar institution,” the subjects also included infants whose resulting scar or pit, Downs points out, “. . . would last a lifetime, indelibly marking a deliberate infection of war and bondage. Few, if any, knew that the scars and pit marks actually disclosed the infant’s first form of enslaved labor, an assignment that did not make it into the ledger books or the plantation records.” [p141-42]
Tragically, this episode was hardly an anomaly, and unethical medical practices involving blacks did not end with Appomattox. The infamous “Tuskegee Syphilis Study” that observed but failed to offer treatment to the nearly four hundred black men recruited without informed consent ran for forty years and was not terminated until 1972! One of the chief reasons for COVID vaccine hesitancy among African Americans has been identified as a distrust of a medical community that historically has either victimized or marginalized them.
Maladies of Empire is a well-written, highly readable book suitable to a scholarly as well as popular audience, and clearly represents a magnificent contribution to the historiography. But it is hardly only for students of history. Instead, it rightly belongs on the shelf of every medical professional practicing today—especially epidemiologists!
Is your morning coffee moving? Is there a particle party going on in your kitchen? What makes for a great-tasting gourmet meal? Does artificial flavoring really make a difference? Why does mixing soap with water get your dishes clean? Why do some say that “sitting is the new smoking?” How come one beer gives you a strong buzz but your friend can drink a bottle of wine without slurring her words? When it comes to love, is the “right chemistry” just a metaphor? And would you dump your partner because he won’t use fluoridated toothpaste?
All this and much more makes for the delightful conversation packed into Chemistry for Breakfast: The Amazing Science of Everyday Life, by Mai Thi Nguyen-Kim, a fun, fascinating, and fast-moving slender volume that could very well turn you into a fan of—of all things—chemistry! This cool and quirky book is just the latest effort by the author—a real-life German chemist who hosts a YouTube channel and has delivered a TED Talk—to combat what she playfully dubs “chemism”: the notion that chemistry is dull and best left to the devices of boring nerdy chem-geeks! One reason it works is because Nguyen-Kim is herself the antithesis of such stereotypes, coming off in both print and video as a hip, brilliant, and articulate young woman with a passion for science and for living in the moment.
I rarely pick up a science book, but when I do, I typically punch above my intellectual weight, challenging myself to reach beyond my facility with history and literature to dare to tangle with the intimidating realms of physics, biology, and the like. I often emerge somewhat bruised but with the benefit of new insights, as I did after my time with Sean Carroll’s The Particle at the End of the Universe and Bill Schopf’s Cradle of Life. So it was with a mix of eagerness and trepidation that I approached Chemistry for Breakfast.
But this proved to be a vastly different experience! Using her typical day as a backdrop—from her own body’s release of stress hormones when the alarm sounds to the way postprandial glasses of wine mess with the neurotransmitters of her guests—Nguyen-Kim demonstrates the omnipresence of chemistry to our very existence, and distills its complexity into bite-size concepts that are easy to process but yet never dumbed-down. Apparently, there is a particle party going on in your kitchen every morning, with all kinds of atoms moving at different rates in the coffee you’re sipping, the mug in your hand, and the steam rising above it. It’s all about temperature and molecular bonds. In a chapter whimsically entitled “Death by Toothpaste,” we find out how chemicals bond to produce sodium fluoride, the stuff of toothpaste, and why that not only makes for a potent weapon against cavities, but why the author’s best buddy might dump her boyfriend—because he thinks fluoride is poison! There’s much more to come—and it’s still only morning at Mai’s house …
As a reader, I found myself learning a lot about chemistry without studying chemistry, a remarkable achievement by the author, whose technique is so effective because it is so distinctive. Drawing on humorous anecdotes plucked from everyday existence, Mai’s wit is infectious, so the “lessons” prove entertaining without turning silly. I love to cook, so I especially welcomed her return to the kitchen in a later chapter. Alas, I found out that while I can pride myself on my culinary expertise, it all really comes down to the way ingredients react with one another in a mixing bowl and on the hot stove. Oh, and it turns out that despite the fearmongering in some quarters, most artificial flavors are no better or worse than natural ones. Yes, you should read the label—but you have to know what those ingredients are before you judge them healthy or not.
Throughout the narrative, Nguyen-Kim conveys an attractive brand of approachability that makes you want to sit down and have a beer with her, but unfortunately she can’t drink: Mai, born of Vietnamese parents, has inherited a gene mutation shared with many people of East Asian ancestry that interferes with the way the body processes alcohol, so she becomes overly intoxicated after just a few sips of any strong drink. She explains in detail why her “broken” ALDH2 enzyme simply will not break down the acetaldehyde in the glass of wine that makes her guests a little tipsy but gives her nausea, a rapid heartbeat, and sends a “weird, lobster-red tinge” to her face. Mai’s issue with alcohol reminded me of recent studies that revealed the reason that some people of northern European ancestry always burn instead of tan at the beach is due to faulty genes that block the creation of melanin in response to sun exposure. This underscores that while race is of course a myth that otherwise communicates nothing of importance about human beings, in the medical world genetics can serve as a powerful tool to explain and treat disease. As for Mai, given the overall health risks of alcohol consumption, she views her inability to drink as more of a blessing than a curse, and hopes to pass her broken gene on to her offspring!
The odds that I would ever deliberately set out to read a book about chemistry were never that favorable. That I would do so and then rave about the experience seemed even more unlikely. But here we are, along with my highest recommendations. Mai’s love of science is nothing less than contagious. If you read her work, I can promise that not only will you learn a lot, but you will really enjoy the learning process. And that too, I suppose, is chemistry!
[Note: I read an Advance Reader’s Copy of this book as part of an early reviewer’s program]