Category Archives: English writing

The Election and the War on the West

The Democrats are still reeling from their crushing defeat––a shellacking, as Obama might have put it––in this year’s presidential election. Eight years ago, when they first lost to the MAGA movement, I received the news with shock, anger, and sorrow. This time around, I was shocked not by the election result but by the fact that so many Democrats were again caught off guard by what appeared to be a rather predictable outcome.

I don’t claim to have any above-average understanding of American politics. All I did was read the Wall Street Journal, pay attention to Polymarket, and listen to podcasts like Honestly with Bari Weiss, All-In, The Megyn Kelly Show, and The Joe Rogan Experience. That was enough for me to conclude, several weeks ahead of the election, that Trump was going to win easily even though the polls said otherwise.

What is even more surprising is that the Democrats still cannot agree on why they lost so badly. According to Nancy Pelosi and Rachel Maddow, their party did nothing wrong.  They lost simply because they were up against a global anti-incumbent wave set in motion by pandemic-induced inflation.

Bernie Sanders begs to differ. In a scathing post-election statement, he scolded the Democratic Party for abandoning working-class people and attributed its humiliating loss to their mass defection.

Biden’s age and ego were frequently cited as another culprit: had he not attempted to run again and instead allowed a proper primary to run its natural course, the liberals might have rallied around a stronger candidate than the hastily anointed Kamala Harris.

Others grudgingly conceded that the Biden administration had misread and mishandled the immigration crisis at the border. While Trump’s lie about “dog-eating aliens” was debunked and ridiculed, the truth is that he succeeded in keeping the spotlight on an issue that progressives struggled to defend. In the end, even the sanctuary cities in blue states had lost their appetite for more migrants arriving on buses sent by the governors of border states.

What else?

Interestingly, many Democrats became defensive on cultural issues, especially when “wokeism” or “DEI” was cited to explain their defeat. In a recent episode of The Daily Show, Jon Stewart got into a testy argument with his guest about the role DEI may or may not have played in the election. John Oliver, a Stewart disciple, similarly pushed back against this theory on his well-regarded show, Last Week Tonight. In full disclosure, I am a big fan of both men. However, I find their dismissive attitude unconvincing and unhelpful.

Like it or not, many conservatives regularly discuss cultural issues in apocalyptic terms. Shouldn’t the liberals at least try to sympathize with their concerns and emotions, if not meet them halfway?

When I watched Elon Musk’s stump speeches at Trump rallies––I know, I am not supposed to––I immediately noticed the man’s fixation on cultural issues. This election, he often told the audience in an uncharacteristically solemn voice, is our last chance to save Western civilization. Many people would find this proclamation preposterous. Isn’t Trump supposed to be the greatest threat to our democracy, the crown jewel of Western civilization? Has Musk really gone crazy, as alleged by a popular pre-election sticker liberals rushed to put on their Teslas? If he has, then madness must have infected many others. Liz Truss, a former British prime minister, used very similar language in a recent Wall Street Journal opinion piece. “Mr. Trump,” she wrote, “can do more than end wokeism and kickstart the American economy: he can save the West.”

If you have read Douglas Murray’s The War on the West, you will better understand where this sentiment comes from. Murray describes a civilization under attack from within and without, while few in the West see the imminent and grave danger. The book is meant to be a rallying cry, a logical prelude to fighting back, now signified by Trump’s resounding electoral victory––I suppose that’s how most Trump supporters, Musk and Truss included, read it.

The War on the West is first and foremost a culture war.

On one side of the battleground stands the Western canon, which prides itself on its profound contributions to philosophy, science, literature, and the arts. Through the Enlightenment and the Industrial Revolution, the canon has brought sustained economic growth, extraordinary prosperity, and human flourishing. Thanks to these accomplishments, Western civilization has dominated the world for centuries—politically, militarily, economically, and culturally. Seen from the vantage point of the West, humanity has ascended to an unprecedented height under its hegemony, and the ascent still shows no sign of abating.

This conventional wisdom, however, has been relentlessly challenged by an anti-Western intelligentsia since the end of World War II—led by authors like Jean-Paul Sartre, Michel Foucault, and Edward Said. Murray concedes that the rise of anti-Westernism was an inevitable correction to a prolonged and repressive colonial order. However, that correction quickly turned into an overcorrection and, in recent years, has deteriorated into a full-blown assault—not on the misdeeds and atrocities of the imperialist Western empires of the past, but on Western civilization as a whole. In the minds of these anti-Western warriors, the hegemonic culture of the West is fundamentally racist, greedy, power-hungry, hypocritical, and sometimes genocidal. It would never voluntarily relinquish power and control over other peoples and civilizations. This might sound like hyperbole. But how else could one make sense of the famous chant led by the Reverend Jesse Jackson, an American civil rights icon: “Hey hey, ho ho, Western Civ has got to go”?

In fact, Edward Said, who was educated at Princeton and taught at Columbia, described Europeans almost exactly this way.   “Every European,” he wrote in his masterpiece Orientalism, “in what he could say about the Orient, was consequently a racist, an imperialist and almost totally ethnocentric.”   Regarding the work of Michel Foucault, another towering intellectual of post-colonial studies, Murray has this to say:

Taken in its totality, his work is one of the most sustained attempts to undermine the system of institutions that had made up part of the Western system of order. His obsessive analysis of everything through a quasi-Marxist lens of power relations diminished almost everything in society into a transactional, punitive and meaningless dystopia.

Thus, to Foucault, Said, and their enthusiastic followers, there is almost nothing good to be said about the West. Murray rejects this absolutist anti-Western sentiment, though much of his defense is built around some form of whataboutism.

Murray points out that racism, often considered an original sin of the West, existed in non-Western societies as well. For example, many Chinese dialects refer to foreigners as “gui” (ghost) instead of “ren” (human). As a Chinese person, I can confirm he is not far off. I would add that the Chinese once used derogatory terms for different foreigners: “yangguizi” (foreign ghost) for Westerners, “xiaoguizi” (little ghost) for the Japanese, “bangzi” (stick) for Koreans, “A’san” (little wretch) for Indians, and so on.

Another anecdote mentioned by Murray surprised me. According to him, Kang Youwei, a prominent scholar and reformer during the late Qing Dynasty, argued that white people or “yellow” people should be rewarded if they were willing to marry black people,  because their sacrifice could help “purify humankind.”

Murray acknowledges the enormous pain and suffering that the transatlantic slave trade inflicted on Africans but insists that the West was not alone in the guilt of perpetuating this ancient and horrific institution. The slave trade was rampant in the Arab world—we know so little about it today only because, according to Murray, the Arabs systematically castrated their slaves. Brazil and the Ottoman Empire continued the slave trade decades after the costly Civil War ended slavery in North America in the early 1860s. By that time, the British Empire had long outlawed the practice and spent a fortune policing the oceans and compensating slave owners for their lost “assets”—in fact, so much debt was taken on to foot the bill that British taxpayers did not pay it off until 2015.

Murray also questions “the notion that colonialism is always and everywhere a bad thing.” In fact, many nations that emerged in the postcolonial world failed spectacularly, sometimes subjecting their people to far greater misery than under colonial rule. Murray even thinks it is unfair to blame the Europeans for “stealing” the Americas from the native peoples, because “the whole history of our species was one of occupation and conquering” until the modern era. Also, do we really believe the American Indians and the Aztecs would have fared better had their land been “discovered” by someone else? These arguments are far from airtight, but they are not complete nonsense either.

Murray is also exasperated by the campaign of defamation and purging directed against historical figures revered in the West. In recent years, these efforts have escalated from critiques in books and magazines to violent protests and acts of vandalism.

It has become fashionable on the left to desecrate or destroy the statues of people who have done or said anything judged as incompatible with the latest edition of the progressive code of conduct.

Voltaire was canceled because he had invested in the French East India Company and made a racist comment about Africans in a book.

John Locke was canceled because he owned stock in companies involved in the slave trade.

Thomas Jefferson was canceled because he not only owned slaves but also impregnated one—that second offense had to be a sexual assault because, evidently, a slave could not give valid consent.

Even the reputation of Abraham Lincoln, once described by Tolstoy as a man “bigger than his country,” was in serious trouble, partly due to his alleged mistreatment of American Indians.  He also made racist comments, and once advocated for deporting Black people from the United States altogether.

The cancellation that truly sent Murray into a frenzy—given that he is British—was that of Churchill, whose racist worldview was no secret to any student of history. Churchill must be canceled, Murray writes indignantly,

because as long as his reputation stands, the West still has a hero; (he must be canceled because) they want to kick the “white men,” they want to kick at the great man view of history; and they want to kick at the holiest beings and places of the West.

This culture war has spilled out from the realm of intellectual quarrels into many aspects of social life in the West.

DEI initiatives feature prominently in Murray’s narrative. While few would disagree with their noble objectives, in practice these programs often conflict with other values long cherished in the West, such as meritocracy and equal opportunity (rather than equal outcomes). This has led to great confusion about the trade-offs between pursuing equity and rewarding merit.

In certain quarters on the left, the word “merit” itself has acquired a racist connotation. So have any quantitative tools, such as the SAT or GRE, designed to assess merit or produce a ranking in a population.  Suffice it to cite one quote from Ibram X. Kendi, whose radical antiracist writing Murray repudiated repeatedly in his book:

Standardized tests have become the most effective racist weapon ever devised to objectively degrade Black minds and legally exclude their bodies.

Regardless of their professed goals, many DEI programs have degenerated into a campaign to make every institution in the nation—political offices, universities, big corporations—“look like its population.” DEI is considered such an inherent, unequivocal good that its arrival must be hastened. It is not enough that everyone agrees on the goal and strives to achieve it; a great leap forward is needed to make it a reality now, at least in appearance. An article published in The New York Times in July 2020—which I read about in Murray’s book—captures this burning ideology perfectly. Its headline reads: “To Make Orchestras More Diverse, End Blind Auditions.”

The war on the West has come disturbingly close to a direct assault on “whiteness,” including white people. In its extreme form, Murray contends, the rhetoric not only bears all the hallmarks of racism but sounds “protogenocidal.” If you think he suffers from paranoia, consider the following anecdotes from the book:

  • A New York Times contributing editor claims that whiteness is “a virus that, like other viruses, will not die until there are no bodies left for it to infect.”
  • The Arizona Department of Education declared that white babies can begin to express racial prejudice when they are only three months old, and that at the age of five they “remain strongly biased in favor of whiteness.”
  • Author Robin DiAngelo wrote in White Fragility that “white people were all racist,” and that white people who deny this truism “were simply providing further evidence of their racism.”
  • A mandatory DEI course for Coca-Cola employees suggested they needed to be “less white, less arrogant, less certain, less defensive, less ignorant and more humble.”

These words remind me of how the bourgeoisie and landlords were denounced and vilified during the politically engineered mass frenzies in China between 1949 and 1979. The difference is that, at least in theory, the bourgeoisie and landlords could redeem themselves by relinquishing their social status and properties. How can white people convincingly relinquish their whiteness?

Perhaps the worst development of all is the intolerance of dissenting opinions. In many cases, this intolerance turns into bullying, intimidation, and even threats of persecution against anyone who dares to voice support for a dissenting view. As Murray laments,

It is so often made clear that whether you’re a math teacher or a partner in a vast multinational firm, the cost of raising your head above the parapet can lead to your whole career crashing down around you. And it can happen from asking the simplest of questions, asserting a provable truth, or simply acknowledging a belief that everybody held until the day before yesterday.

For the record, I’m not sure how much Murray has overstated his case here. However, it does not take many precedents—and I have heard of many—for most people to learn the lesson and voluntarily shut their mouths. Even if only a fraction of the population finds its freedom of speech infringed with impunity, democracy can suffer a terrible setback, possibly irreversible damage.

If Elon Musk is to be believed, it is this grave concern that compelled him to turn Twitter into X at considerable personal financial cost. Musk may well be wrong—he may even have made Twitter much worse—but I find no particularly good reason to doubt his sincerity and motives.

I read The War on the West long before this election, having heard of it on a podcast (either Sam Harris or Bari Weiss). I remember it as a thriller: intense, controversial, but highly informative—an eye-opening experience in some ways. I suspect most Democrats won’t receive the book well—that is, if they can muster the patience to finish it at all. However, it would be a mistake for them to reject such a book out of hand. They ought to read it, if only to crack the mystery that is still haunting them after eight long years: why so many voters cast their presidential ballots for a demagogue who, in their minds, talks so much, knows so little, has so many character flaws and so few moral virtues.

Some Democrats may dismiss The War on the West as yet another conspiracy theory from the right. Many more may conclude, with the usual self-righteousness and condescension, that they are fighting on the right side of history. But I am convinced that if they do not change course and tactics, they will continue to lose this war—and more elections in the years to come.

Marco Nie, Wilmette

November 30th, 2024

What Hillbilly Elegy reveals about J.D. Vance

If J.D. Vance were not a candidate for the US vice presidency in this election cycle, I would never have read his famed memoir. Memoirs are not among my favorite genres, and reading one written by a 30-year-old Yale Law alumnus turned venture capitalist seemed like a waste of time. Don’t get me wrong—I have no doubt that someone with Vance’s résumé is smart, ambitious, and hard-working, and that his life may even be interesting. However, stories of such prodigies are abundant in this country, thanks to popular culture’s obsession with them. While these successes are well-deserved and respectable, they hardly inspire any curiosity or excitement in me.

Now that Vance is on the ticket for the highest office in the land, paired with the most controversial and divisive politician in generations, his memoir suddenly becomes a window into his inner world—his beliefs, values, and preferences that could profoundly shape the future of this country.    My interest was also piqued because Vance was known for his anti-Trump stances—he famously compared his current running mate to Hitler. Would his book reveal any clues about his 180-degree turn on Trump? Was his change of heart simply political expediency or the result of some sort of epiphany? In any case, I felt this was a book I needed to read, even if I did not want to.

Given my relatively low expectations, Vance’s book was a surprisingly smooth and thought-provoking read.  As a competent writer, he knows how to command the attention of the reader through storytelling.  I was never bored, partly because the lives of hillbillies—white working-class people from rural, mountainous regions of the United States—feel so alien to me. Of course, I’ve heard about the “white working-class,” but never before had I been brought so close to the vivid details of their day-to-day lives.

Vance’s maternal grandparents were from Appalachian Kentucky, which they left for the Midwest at a young age. Vance speculates that his grandmother’s unexpected teenage pregnancy may have hastened their departure. However, they were largely part of a broader wave of Appalachian people migrating to America’s industrial heartland in search of better opportunities.   The young couple settled in Middletown, Ohio, where Vance’s grandfather secured a blue-collar job in the steel industry, which, in those good old days, paid well enough to support a middle-class family.

Vance’s mom was a good student in high school—even the salutatorian, according to the book. However, her life was derailed after she became a single teenage mother. Following her first unsuccessful marriage, she married a few other men and dated many more, but gave birth to only one more child, the lucky J.D. The central plot of the book revolves around how young Vance grew up with a mother who led a tragically chaotic life and was rarely able to provide him with anything resembling a normal home. He never had a stable father figure—the introduction of a new father to the home always brought an escalating series of dramas that ended with the disintegration of the family. Mostly, the destructive force seemed to come from Mom—at least that’s the impression one gets from reading the book. Here is how Vance describes one of the episodes that unfolded after Mom moved in with Matt, one of her boyfriends (or husbands—it’s not entirely clear).

Living with Mom and Matt was like having a front-row seat to the end of the world. The fighting was relatively normal by my standards (and Mom’s), but I’m sure poor Matt kept asking himself how and when he’d hopped the express train to crazy town. It was just the three of us in that house, and it was clear to all that it wouldn’t work out. It was only a matter of time.

Vance was deeply troubled by his mother’s “revolving door of father figures”—it must have felt like a disgrace that tainted the honor of the extended family.  He recalled being set off by a Facebook post from a 13-year-old girl pleading with her mother to stop changing boyfriends. Sympathizing with the young girl, Vance lamented,

for seven long years, I just wanted it to stop. I didn’t care so much about the fighting, the screaming, or even the drugs. I just wanted a home, and I wanted to stay there, and I wanted these goddamned strangers to stay the fuck out.

It wasn’t just boyfriends and drugs. Mom once threatened to kill him by crashing the car they were riding in, forcing Vance to flee while she pursued him in a rage. The ordeal ended only when the police came to take her into custody. I paused for a long time after reading about this horrifying event, trying to imagine how I would have coped as an 11-year-old boy in that situation—I’m not sure I would ever fully recover from such trauma.

After the incident, Vance struck a deal with Mom: he lied to the judge to keep her out of jail, and she agreed to let him decide where he wanted to live. In the ensuing years, Vance would live briefly with his biological father, then with his half-sister Lindsay on and off (while his mother was either in treatment centers or otherwise unable to care for them), and finally with his grandma—the Mamaw—after his freshman year of high school.

The constantly shifting family structure and endless domestic violence Vance endured in his youth must have left an indelible mark on his psyche. Even as an adult, he regularly has nightmares in which Mom is the monster chasing him in a treehouse. He writes that he “used words as weapons” because he had to survive in a world where “disagreements were war.” He had to fight hard to control the “demons” within him, feeling they were “as much an inheritance as his blue eyes and brown hair.”

Sociologists have shown that children experiencing such family instability often face severe developmental challenges.   According to Vance, he would have succumbed to them had it not been for Mamaw and his sister Lindsay, who provided him with a semblance of stability and much-needed emotional and material support when he needed them most.  Mamaw was his savior, protector and hero. Without her, Vance would probably never have made it out of Middletown, let alone earned a J.D. from Yale and become a disciple of Peter Thiel.  Looking back at his high school years, Vance wrote,

Those three years with Mamaw—uninterrupted and alone—saved me. I didn’t notice the causality of the change, how living with her turned my life around. I didn’t notice that my grades began to improve immediately after I moved in.

Yes, the book is about a poor kid achieving the American dream despite the odds stacked against him.  The young author can be forgiven for wanting to brag about it—his achievements do seem like a small miracle when you realize how close he was to complete ruin. However, the book is also about more than that.

Vance tries to generalize his lived experiences—his struggles as well as his triumphs—to those of his neighbors in Middletown, of hillbillies, and more broadly, of the white working class. He notes that many families in these groups faced similar problems. In fact, his grandparents had their fair share of domestic violence and alcoholism. To appreciate the nature of the violence, it is worth noting that Mamaw once tried to kill her husband by literally setting him on fire after he broke his promise never to get drunk again.

Vance describes his communities as a world of “truly irrational behavior.”  Wherever he looked, he saw only desolation, indolence, and cynicism. But who or what is responsible for the predicament of his people?

It appears that Vance has been pondering this question since his teenage years. While the book is, to some extent, an effort to seek answers, it is by no means a formal and comprehensive analysis. Instead, his thoughts are scattered throughout the book, often presented as spontaneous rants inspired by some random anecdotes. His opinions are nuanced—remarkably so for a writer in his thirties.

To Vance, the hillbilly elegy is, above all, an economic story. In the booming postwar era, vibrant communities sprang up around manufacturing centers in what is now America’s infamous Rust Belt. Yet, these communities, heavily reliant on specific well-compensated blue-collar jobs, were inherently fragile and vulnerable to disruption. When those jobs were lost to globalization and technological advancements, workers and their families faced drastic lifestyle adjustments. Those unable to adjust—often people without advanced degrees or resources—became the “truly disadvantaged.” They found themselves trapped in communities where meaningful social support is scarce. These people were Vance’s family, neighbors, classmates, and friends.

As economically disadvantaged as the hillbillies might be, Vance argues that their conditions are further worsened by several cultural and psychological traits.

The first of these traits is the belief that one’s choices and efforts don’t matter. According to Vance, hillbillies often assume those who “make it” are either naturally gifted or born into wealth and influence. In this view, hard work is not nearly as important. Vance, once influenced by this mindset himself, vehemently rejects it. Before joining the Marine Corps, he doubted whether he had what it took to succeed, even as Mamaw insisted he was destined for something great. Only after enduring boot camp and excelling as a military journalist did he realize that he had been consistently “underselling” himself, mistaking a lack of effort for inability.

Vance urges hillbillies to take personal responsibility for their failures and to stop making excuses. A case in point is Mom. Although Vance acknowledges that genetics and upbringing may have contributed to her substance abuse and erratic behavior, he also believes she bears much of the responsibility. No one, he argues, should be granted “a perpetual moral get-out-of-jail-free card.”

Hillbillies share a deep-seated skepticism toward institutions: news media and politicians are seen as incessant liars, and universities, especially elite ones, are believed to be rigged against their children. This distrust reinforces a sense of helplessness and discourages engagement with society. The logic seems clear: if the path forward is blocked by liars and grifters, why try at all?  To his credit, Vance holds modern conservatism accountable for failing its “biggest constituent.” He writes, “Instead of encouraging engagement, conservatives increasingly foment the kind of detachment that has sapped the ambition of so many of my peers.” According to Vance, it’s the message of the right—that “it’s your government’s fault you’re a loser”—that has planted seeds of cynicism and despair in these communities.

Hillbilly families also have a massive parenting problem. Teachers feel powerless to help their students succeed in school because, as one teacher allegedly told Vance, these kids are “raised by wolves” at home. The cause of poor parenting, it seems, has more to do with culture than economics. Even for those who do live in poverty, their basic material needs—food, clothing, shelter, transportation, and school supplies—are rarely at risk. Mom always made sure Vance and Lindsay had the “trendiest Christmas gifts,” even if it meant spending money she didn’t have. And Mom was hardly alone in her desire to indulge her children’s craving for extravagant gifts. What seems lacking is a fundamental appreciation for raising kids to become educated, responsible individuals. Their actions ultimately harm the children, but they don’t care enough to change course. As Vance observed,

We don’t study as children, and we don’t make our kids study when we’re parents. Our kids perform poorly in school. We might get angry with them, but we never give them the tools to succeed.

Are there solutions to the problems in these communities? Vance didn’t think so, especially not in the form of “a magical public policy or an innovative government program.” Public policy can help, he writes, “but there is no government that can fix these problems for us.” In fact, Vance frequently points out—like a true conservative—how government intervention can make bad problems worse. His greatest frustration appears to be with welfare. He describes, often with exasperation, instances like a neighbor who has never worked a day in her life but unabashedly complains about other welfare recipients abusing the system, or a jobless, drug-addicted acquaintance who often buys T-bone steaks at a grocer—steaks that Vance could not afford while working part-time at the same grocer.

To Vance, the welfare system not only rewards and perpetuates indolence but also creates resentment among those who work hard to earn an honest living. He argues that welfare is one of the main reasons “Appalachia and the South went from staunchly Democratic to staunchly Republican in less than a generation.” His objections feel passionate and authentic, though a bit ironic, given that both he and Mamaw were once welfare recipients themselves.

Vance urged hillbillies to stop “blaming Obama or Bush or faceless companies” and to start asking themselves what they can do to make things better. But how? Vance admitted he didn’t have answers. However, he did suggest that his people might look to coastal elites—the new friends he made at Yale and in Silicon Valley—as potential role models, because

their children are happier and healthier, their divorce rates lower, their church attendance higher, their lives longer. These people are beating us at our own damned game.

If memory serves me well, the book never mentions Trump by name, so we don’t actually know what Vance thought of him back then. That said, Vance the VP candidate is no longer the young Silicon Valley investor who wrote Hillbilly Elegy nearly a decade ago. He has now enthusiastically embraced much of the MAGA agenda. He converted to Catholicism not long ago, reviving the devout Christian faith he had once abandoned. He speaks fondly of government-imposed tariffs as if they were a panacea for the economic plight of the American working class. On social issues, he remains staunchly conservative—pro-life, pro-family, pious, and patriotic. I am sure many progressives find Vance unbearably repulsive: the sleazy, heartless spin on January 6, the adamant opposition to abortion rights, the sexist slur of “childless cat lady,” and the list goes on. However, if you read Hillbilly Elegy, you can at least understand the origins of his politics and behavior: he was trained, as a child, to weaponize words to win petty battles; he longed for a family where kids enjoy safety and stability; and he hated women who mistreat their children.

By now, I’ve listened to many of Vance’s interviews, with both friendly and hostile hosts. It’s clear to me that he possesses a talent rare even among politicians: the power of persuasion. His performance at the Vice-Presidential Debate was nothing short of a political masterpiece, a testament to his extraordinary abilities. He was attentive, respectful, articulate, and persuasive, yet he also conveyed a strong sense of fortitude and conviction. His countenance and tone remained steady throughout, projecting a stoic image remarkably mature for his age—I think that is a gift from his troubled upbringing.

Whatever happens next week, I have my fingers crossed that this man may use his political genius for the good of the American people.

Marco Nie, 11/2/2024

A brief history of travel forecasting

David Boyce and Huw Williams are both esteemed transportation scholars, each with their distinct areas of expertise. With their long and distinguished careers closely intertwined with the development of travel forecasting as an intellectual discipline, it is only fitting that they have chosen to write a book about its past, present and future.

I know Professor Boyce well. My master’s advisor, Professor Der-Horng Lee at the National University of Singapore, studied under Boyce while pursuing his Ph.D. at the University of Illinois at Chicago. Lee’s own master’s advisor, Huey-Kuo Chen from Taiwan’s Central University, was also one of Boyce’s doctoral students during his tenure at the University of Illinois at Urbana-Champaign. This means I am either Boyce’s academic grandchild or great-grandchild, depending on how you count. Thanks to Lee, I was well aware of this academic lineage even before I came to the U.S. in the early 2000s. When I joined Northwestern University as an Assistant Professor in 2006, Boyce was serving his alma mater by teaching transportation system courses as an adjunct professor. It took me a while to process the surreal news that I would now be a colleague of my academic forebear.

During my first meeting with Boyce in Evanston, IL, I learned about his joint book project with Williams, which had already been in progress for several years. The book was an ambitious and intriguing endeavor aimed at reflecting on the achievements, missteps, and challenges in our field. It was finally published in 2015, nearly 12 years after they began the project. Shortly afterward, I was asked to oversee the translation of the book into Chinese, a project that would take another five years. Through this process, I had to read the book cover to cover — and between the lines — several times, ensuring I understood every word and phrase. It was a time-consuming and occasionally frustrating task, to be sure, but a rewarding learning experience nonetheless. Ultimately, it is this rare opportunity that inspired me to share in this essay what I learned from the book and the insights it brought to light.

You may find a preprint of the paper at SSRN. Also check out the following podcast, automatically generated by NotebookLM from a PDF of the paper I fed to it.

From a Culture of Growth to the Needham Question

I was attracted to A Culture of Growth because I heard the book provides answers to the Needham question (李约瑟难题), namely why China, despite its early and significant achievements in technology, fell so far behind the West during the critical developmental phases of modern science.  Until I opened the book, I didn’t realize it was written by a Northwestern economist, Joel Mokyr, whom a friend in the economics department described as a leading authority on economic history.

Although Mokyr addresses the Needham question extensively in the final chapter, the book is neither motivated by nor primarily focused on that question.  Quite the contrary—if you read the book closely, you can’t miss Mokyr’s dismissal of the question itself. To him, the real puzzle isn’t why China—or any other civilization, for that matter—failed to invent modern science, but rather why Europe succeeded. The book is devoted to providing an explanation.

Mokyr’s theory builds on Cardwell’s Law, which states that technological innovation tends to slow down or stagnate once an organization, economy, or civilization reaches a peak accomplishment. The stagnation occurs because the beneficiaries of the status quo become complacent and resist major creative disruptions that could threaten their dominance. Crucially, they often have the power to “suppress further challenges to entrenched knowledge” by either incentivizing would-be challengers to do their bidding or persecuting them as heretics.

How did Europe manage to break the spell of Cardwell’s Law? Mokyr attributed this success to Europe’s “fortunate condition that combined political fragmentation with cultural unity.” This unique environment gave rise to what he called a “Republic of Letters,” a loosely connected federation where intellectuals could freely exchange, contest, refine, and publish ideas across the borders of competing polities. This republic, along with the “market of ideas” it nurtured, rose gradually after the Middle Ages.

Europeans, following Bacon, began to recognize that knowledge could and should be harnessed for society’s material benefit, and that its creation, dissemination, and utilization should be a collective effort.  That is not to say the Republic of Letters was brought about by any concerted effort. Often motivated by the pursuit of lucrative patronage positions, the founding members of the republic sought to build strong reputations among their peers. This motive, in turn, pushed them to support free access to knowledge and uphold the right to challenge any idea, regardless of its origin.

The republic had no inherent hierarchy, except for the one that naturally emerged through fierce but largely free competition for peer recognition, based on a shared understanding of what constitutes merit.   Scholars who rose to the top of the pecking order often did fabulously well for themselves, attracting a “disproportionate amount of fame and patronage.”   They also became recruitment tools for the Republic of Letters and role models for future generations.  Newton was one such superstar whose influence as a model scientist is hard to overstate.  Mokyr wrote of Newton,

he was knighted, elected to Parliament, and became quite wealthy. In 1727 he was given a splendid funeral and interred in a prominent place in Westminster Abbey. Voltaire remarked that he was buried like a well-loved king.

Once the market of ideas took shape, it was sustained by Europe’s favorable geopolitical conditions.   On the one hand, political fragmentation meant that neither scholars nor their patrons could easily monopolize the market of ideas by blocking the entry of potential competitors or buying them off. Incumbents quickly realized that such maneuvers only pushed innovation into the hands of their rivals, ultimately undermining their own competitive advantage.   On the other hand, cultural unity allowed knowledge production and dissemination to benefit from scale. From an economic perspective, scale spreads the fixed costs of production over more output, which is key to profitability and financial viability. It also created a network effect, meaning that scientists could learn from a relatively large pool of peers—standing on the shoulders of many giants, as Newton famously put it.

Mokyr’s “culture of growth” matured during the Enlightenment, an intellectual and cultural movement that promoted the progressive improvement of society through the expansion and application of useful knowledge, while advocating for more inclusive political institutions. In hindsight, it was clear why the Enlightenment played such a pivotal role in Mokyr’s theorization: it was the precursor to the Industrial Revolution, which triggered an unprecedented phase of economic growth that lifted much of humanity above subsistence living standards.

An interesting aspect of Mokyr’s theory is its focus on what he calls cultural entrepreneurs—or thought leaders, in today’s parlance—who played an outsized role in the evolution of the culture of growth. Mokyr believed that useful knowledge was created by “a minute percentage of the population” whose primary occupation is, in Adam Smith’s words, “to think or to reason” for “the vast multitudes that labour.”  In fact,

 what the large majority of workers and peasants knew or believed mattered little as long as there were enough of them to do what they were told by those who knew more.

While Mokyr’s assessment is supported by historical evidence, I imagine many would find such an unapologetically elitist view of cultural development difficult to accept. For me, it feels almost heretical: growing up in China, my history and political science teachers repeatedly taught, with absolute certainty, that it was the proletariat who, through class struggle, drove societal progress and historical development.

Mokyr’s theory can explain why China experienced a burst of intellectual development during the Spring and Autumn and Warring States periods (770 – 221 BCE).  There are striking similarities between the geopolitical conditions of China in this classical era and those of Europe after the High Middle Ages: numerous relatively small states engaged in intense and perpetual competition for dominance, a vast territory with diverse terrain, and a shared cultural tradition including language, institutions, and faith.  Many cultural entrepreneurs—collectively known as the “Hundred Schools of Thought”—emerged at this time and left indelible marks on the Chinese literary canon.  Like their European counterparts nearly two millennia later, these intellectuals built their reputations by creating and sharing knowledge, and when opportunities arose, they happily crossed the borders of rival states to seek more profitable employment for their skills.

As mentioned earlier, in Europe, the greatest achievement of the market of ideas was the Enlightenment. In China, however, a similar market of ideas culminated in a political philosophy that blended Confucianism and Legalism—what I shall refer to as “Confuleg” for lack of a better term (in Chinese, 儒法, or more precisely 儒表法里).

Confuleg went on to become the political philosophy that underpinned the key institutions of the Qin-Han Empire, the first to truly unify what is now China under a powerful and centralized state. In the ancient world, this was a towering achievement—socially, politically, and economically. In fact, the state model based on Confuleg was so successful that one could argue, to some extent, China still operates in its long shadow even today.  However, Confuleg’s ascent to hegemony in China was effectively a death sentence for the market of ideas.

Since the Qin-Han empire, China has seen dynastic succession roughly once every few hundred years, each usually accompanied by an extended period of turmoil, violence, and destruction.

When China is ruled by a centralized state, the Republic of Letters cannot survive, as Mokyr’s theory predicts.  Since the best employment opportunities for intellectuals could only be found in the state’s bureaucratic system, producing new knowledge or earning a reputation among peers no longer promises financial security.  Instead, survival requires pledging allegiance to the state (i.e., the emperor himself), internalizing the principles of Confuleg as one’s own beliefs and values, and excelling in the exams designed to test the ability to memorize and interpret classical texts.   More importantly, the state does not tolerate any competition with its monopoly over ideas.  Questioning the state-sanctioned ideology is viewed not only as heresy but as an act of treason, often carrying the gravest of consequences.

When the centralized state collapsed, one might expect that the ensuing chaos and factional warfare would create an environment favorable for a thriving market of ideas. After all, isn’t that exactly what happened during the Warring States period? Not quite.    The Chinese Canon maintained its powerful grip on intellectuals through these turbulent times. It even survived the brutal and repressive Mongol rule, which lasted nearly a century.  Why?

David Hume (1711 – 1776) observed that few Chinese after the classical period had the courage to “dispute what had been universally received by their ancestors.”  John Stuart Mill (1806 – 1873) echoed this view, noting that the Chinese tended to “govern their thoughts and conduct by the same maxims and rules,” and as a result (emphasis mine),

they have become stationary—have remained so for thousands of years; and if they are ever to be farther improved, it must be by foreigners.

Today, these words may sound condescending, if not outright discriminatory. However, I often wonder what China might be like today had Westerners never forced their way in. Would it still more or less resemble the world under the reign of Emperor Qianlong in the late 1700s?

Beyond conformity to the same maxims, most Chinese thinkers shared a peculiar, pessimistic nostalgia for a world once ideal and perfect but irretrievably lost. Mokyr identified this trait among the Neo-Confucians—the followers of Cheng Hao, Zhu Xi and Wang Yangming—who dominated the intellectual world during the Ming dynasty.  These scholars regarded antiquity, Mokyr wrote, as “the ideal period, followed by a decline, with no guarantee that the world would ever be better.”   However, this mindset did not originate from the Neo-Confucians; it can be traced back to Confucius himself, who lamented the disintegration of the Western Zhou institutions he regarded as ideal, saying

Zhou observed the two preceding dynasties, flourishing with culture and refinement! I follow the Zhou. (周监于二代,郁郁乎文哉!吾从周)

It is hardly surprising that such an inherently backward-looking worldview would become an obstacle to new ideas.

What does Mokyr’s theory suggest about the future of innovation in human society?

The Republic of Letters that once thrived in pre-industrial Europe has long since disappeared, replaced by a vast scientific enterprise supported by a plethora of public and private institutions. However, some key principles from the old republic remain.

First, freedom of expression is still a foundational value. In universities, this is institutionalized through tenure, ensuring that professors’ livelihoods are protected from those who dislike their ideas. Second, a scholar’s value continues to be largely determined by their reputation among peers. This is why peer review, whether for publications or grants, remains the gold standard in academia, despite frequent criticisms of inefficiency, inconsistency, and unfairness.

The value of the science enterprise as an indispensable pillar of modern society is almost universally recognized today. Thanks to globalization, science has truly become a global affair: ideas, money, and scholars can now move freely across borders.   This all sounds uplifting until you realize where innovations are first made and adopted still matters a lot.  As Chris Miller explained in Chip War, leadership in science and technology has been the cornerstone of America’s national security strategy.   Until recently, the open science enterprise has served this strategy pretty well.

From pioneering semiconductors to exploring space, from mapping the human genome to advancing artificial general intelligence, the U.S. has consistently led the way. While much of this success can be attributed to America’s global hegemony, her strong commitment to the core values of liberal democracy—free speech, property rights, and limited government—must have also played a crucial role, according to Mokyr’s theory.

The meteoric rise of China has apparently shaken America’s faith in open science.  Reasonable people can disagree on the nature of the current Chinese regime; but few can claim with a straight face that Chinese citizens enjoy much political freedom, as the term is usually understood in the West.  The Chinese do not elect their leaders through open and free elections; their speech is tightly monitored and censored; and they are largely ruled by law, rather than being protected by the rule of law.  In theory, such an environment should be hostile to the market of ideas and, hence, to innovation.

Yet, China has made remarkable strides in science and technology since the turn of the century. By 2025, China is projected to produce nearly twice as many STEM PhD graduates annually as the U.S.  In 2022, Nature, one of the most prestigious scientific publishers, reported that China had surpassed the U.S. to claim the top spot in the Nature Index for natural sciences. Additionally, a recent report indicated that by 2024, China was home to 369 unicorn startups (compared to about 700 in the U.S.), with nearly a quarter focused on AI and semiconductor sectors. Companies like Huawei have become such formidable tech giants that Chris Miller asks nervously in Chip War: “Could the United States allow a Chinese company like this to succeed?”

China’s rapid advances in science and technology raise a fascinating question that Mokyr’s theory seems unable to fully address: can innovation flourish and economic growth be sustained under an authoritarian regime like modern-day China?

If China were isolated from the global science enterprise, I would respond with a resounding “NO.” History has shown that when intellectuals are not allowed to freely speak their minds—as seems to be the case in China today—the market of ideas withers, dragging down opportunities for creative disruption and sustained economic growth.  However, could China simply grab the fruits produced by the global science enterprise, without ever having to maintain a thriving market of ideas of her own?  Could Chinese intellectuals and entrepreneurs continue to be creative and productive in advancing science and technology, even while their other rights, including freedom of speech, are severely impaired?

These are open questions.  However, the U.S., understandably anxious about her security, is not taking any chances. In recent years, she has taken drastic steps to restrict China’s access to cutting-edge technology and to limit interactions between the scientific communities of the two nations, particularly in areas with potential national security implications.   It is disheartening to see the leader of the free world openly retreating from a foundational principle of that world: that science should be freely accessible to all for progress and prosperity.   America’s China Initiative may also lend credibility to the popular narrative among Chinese nationalists that so-called Western values are a mere disguise for self-interest –– or worse, deep-seated racism against non-white people.

It is too early to determine whether America’s isolation measures will be effective, or even necessary, in curbing China’s ambition to lead global innovation in the coming century. What we can say with some certainty is that a less open science enterprise will be less vibrant and productive, and likely a less desirable place for scholars, especially those who are stuck between a rock and a hard place.  Politicians and strategists who support the China Initiative argue that this is a price worth paying to protect our freedoms and uphold the liberal world order. Only time will tell if they are right.

Marco Nie, Wilmette

September 22, 2024

The Invention of the Jewish People

Growing up in 1980s China, I vividly remember the smoke, bodies, and ruins on the TV screen, accompanied by unfamiliar phrases like “Gaza Strip” (加沙地带) and “West Bank” (约旦河西岸), uttered in Chinese by the hosts of CCTV’s famously boring evening news program.  After I came to the U.S., similar scenes from Palestine continued to drive news cycles. However, this superficial familiarity did not mean I knew much about the conflict.  Why would I care? The dreadful stories from that land seemed to be getting old, and the people involved appeared hopelessly trapped in the past, while the world moved on.  October 7th and its aftermath changed me. Like many others, I struggled to make sense of the horror of that day, the ensuing humanitarian crisis in Gaza, and the widening chasm that suddenly threatened to engulf America. I have never seen so many protests on college campuses, including the one where I teach. More strikingly, the vast majority of protesters seemed to support the side that, from my understanding, had started the war in the worst way imaginable.

Perhaps they knew something I did not. Maybe my lack of historical knowledge clouded my judgment. In any case, I am determined to understand better. I decided to start with the history of the Jews, the chosen people who have claimed Palestine in the name of God.

“The Invention of the Jewish People,” written by Professor Shlomo Sand, an Israeli historian at Tel Aviv University, was the first book I stumbled on.  As usual, the book caught my attention because of its controversies. After finishing the book, however, I realized that the title might be unnecessarily provocative. “The Invention of the Modern Israeli State” would be more accurate, although inventing “the Jewish people” sounds much more exciting.

The book addresses a fundamental question: are Jews a people or a religion? To me, this was the most confusing aspect about the Jewish identity. I had always assumed Jews were simply believers of Judaism. Precisely because of their faith, Jews have been cursed, persecuted, and slaughtered for thousands of years by pagan Romans, Arabic Muslims, European Christians, and Nazis. This ancient hatred was so intense and enduring that an entire vocabulary of words was created to describe it: antisemitism, pogrom, Holocaust, ghettos, and genocide.

However, the mainstream view among Israelis today, according to Sand, is that Jews are also an ethnic group whose ancestry goes back to the apocryphal accounts in the Bible. To understand this claim, let me briefly recount Jewish history in its first millennium.

Legend has it that Abraham is the biological progenitor of all Jews.  God revealed the Truth to Abraham and made a covenant with him, promising that his descendants would become a great nation and be given the land of Canaan, which is today’s Palestine.   Abraham’s descendants were briefly enslaved by the Egyptians but were led by Moses back to the Holy Land, which they eventually conquered. By 1000 BCE, the Kingdom of Israel emerged, ruled sequentially by three legendary kings: Saul, David, and David’s son Solomon.   The kingdom reached its peak under Solomon, who built the First Temple, as well as grand palaces where he famously housed an enormous harem.  The death of Solomon in 930 BCE plunged Israel into a chaos from which it would not fully recover until perhaps modern times.  The nation was divided into Israel in the north and Judah in the south. Israel fell to the Neo-Assyrian Empire in 720 BCE, and Judah subsequently fell to Babylon in 586 BCE.

The conquest of Judah by Babylon was the first traumatic event—it destroyed Jerusalem and the First Temple, sending the first wave of Jews into exile in Babylon and Egypt. Many Jews would later return to the Holy Land and build the Second Temple with the help of the First Persian Empire, which by then had dominated the land between the Mediterranean and India. However, the Jewish nation would continue to exist as a vassal state—controlled first by the Persians and then by the Greeks—until 165 BCE, when the Hasmonean Kingdom gained independence following a revolt against the Greeks.

Then it was the Romans’ turn to ravage Palestine.  The Hasmonean Kingdom fell to Pompey’s legions in 63 BCE. The following two hundred years were marked by tremendous upheaval in Palestine, including the destruction of the Second Temple in 70 CE.  The conflict culminated in the Bar Kokhba revolt (132 – 136 CE), during which the Romans effectively decimated Judea and wiped out most of its Jewish population.   Jews began to leave Palestine in droves and would not return in great numbers until the 20th century.  Over the next two thousand years, the Jewish diaspora would spread across the world, sometimes on its own terms, but more often driven by relentless persecution.

Crucially, the claim that Jews are an ethnic group implies that, no matter where they live now and how they have migrated through the ages, Jews have managed to maintain the purity of their ethnicity. The Jewish diaspora is seen as a nation in exile, unfairly deprived of its land promised by God, and thus has the right to return to it.  As Sand put it,

“National mythology determined that the Jews—banished, deported or fugitive emigrants—were driven into a long and dolorous exile, causing them to wander over lands and seas to the far corners of the earth until the advent of Zionism prompted them to turn around and return en masse to their orphaned homeland. This homeland had never belonged to the Arab conquerors, hence the claim of the people without a land to the land without a people.”

This image of a Jewish nation in exile, however, was a modern invention that only began to take shape in the second half of the 19th century.  According to Sand, up to that point, Jews still primarily identified themselves through their shared religion rather than a common ethnic lineage.   Yet the Bible would soon be “transferred from the shelf of theological tracts to the history section”, becoming an ethnic marker that indicates “a common origin for individuals of very different backgrounds and secular cultures yet all still hated for their religion”.  Armed with the Holy Book, now interpreted as a historical document, Jewish intellectuals in Germany, such as Heinrich Graetz and Heinrich von Treitschke, began to frame the history of Judaism “as the history of a nation that had been a kingdom, became a wandering people and ultimately turned around and went back to its birthplace”.

Sand’s theory is built upon three pillars.

First, Sand maintains that there was never a mass forced deportation of Jews from their homeland after the fall of the kingdom.  Instead, their emigration took place gradually over several centuries, due largely to the expropriation of Jewish land, first by Emperor Hadrian (one of the “Five Good Emperors” of Rome) following the Bar Kokhba revolt, and then by the new conquerors under the banner of Muhammad in the seventh century.  Moreover, Judaism teaches that the exile from the Holy Land serves as a form of redemption, and the return must await the End of Days, when the Messiah will arrive to resurrect the dead and offer salvation. Therefore, there was no “voluntary return” either, for that would be considered an attempt to “hasten the end and rebel against God’s spirit”.   As a result, Jews began migrating to Palestine in significant numbers, as Sand notes, “only when the American borders closed in the 1920s, and again after the horrendous Nazi massacres”.

Second, Sand argues that Judaism has not always been an exclusive religion of a chosen people. On the contrary, ancient Judaism was “as keen to propagate itself as Christianity and Islam would be in the future.” In fact, without significant proselytizing efforts that lasted more than 300 years, starting from the period of the Hasmonean Kingdom, the Jewish population could not have reached its current scale. The later exclusivity was more or less forced upon Judaism by Christianity’s more successful marketing strategy, as well as the edicts of Christianized Roman emperors, which forbade, among other things, the circumcision of males who were not born Jews, including slaves. Of course, this early period of mass conversion to Judaism –– whose memory has been deliberately eradicated by Zionists, according to Sand –– directly contradicts the notion of a pure ethnic lineage tracing back to Abraham and his sons.

The third and perhaps most controversial pillar is the so-called Khazar Hypothesis. According to this theory, a large part of Ashkenazi Jews—who lived mostly in Central and Eastern Europe before World War II, with a great many perishing in the death camps constructed by Hitler—were not migrants from Western Europe, especially Germany, but rather descendants of the Khazars,  a Turkic people who established a powerful empire in the region between the Caspian and Black Seas during the 7th to 10th centuries.  The existence of a Khazar Kingdom that converted to Judaism sometime between the 8th and 9th centuries has been accepted by many.  For example, another book I read recently, “A Short History of the Jewish People,” corroborates this. However, there seems to be no consensus on what happened to the Jews living in that kingdom after it was conquered by the Mongols.  Here is what Sand writes:

“The Mongols did not understand the needs of land cultivation in the vast territories they captured, and did not sufficiently care for the farming needs of the subjugated populations. During the conquest, the irrigation systems that branched from the wide rivers—systems that had sustained the cultivation of rice and vineyards—were demolished, causing the flight of masses of people and depopulating the prairies for hundreds of years. Among the emigrants were many Jewish Khazars who, together with their neighbors, advanced into the western Ukraine and hence to Polish and Lithuanian territories”.

To be sure, Sand lacks any substantial archaeological evidence to support his conjecture about a mass westward migration of Jewish Khazars. In fact, many scholars even doubt that a mass conversion of the Khazars to Judaism ever took place. However, he is right to ask why an interesting and plausible historical theory has been vilified in Israel as heretical, scandalous, disgraceful, and anti-Semitic since the 1970s.

Whether or not Sand has all the facts correct—his book has faced pushback from fellow historians—his debunking of the modern concept of the “Jewish People” appears well-reasoned and persuasive, at least to a layman like me. The question is, why did Sand take it upon himself to debunk his own people? As a historian, he fully understood that the reconstruction of Jewish identity was part of the European nationalist movements of the time.

Nationalists have always looked to a glorified past to validate their distinct historical existence.

The Nazis imagined a mythical Germanic past where Aryans were portrayed as a superior, pure race destined to lead the world.

The Meiji Restoration in Japan revived Japan’s first emperor, Jimmu—who was purportedly a descendant of the sun goddess Amaterasu—to affirm the existence of a divine and unbroken imperial line.

As a kid, I was taught that all Chinese are 炎黄子孙, i.e., the direct descendants of the Yan Emperor (炎帝) and the Yellow Emperor (黄帝), who are said to have ruled China around 2700-2600 BCE. While I always understood these legends were to be taken with a grain of salt, I never doubted for a moment—until after living in the U.S. for some years—that the Chinese are an ethnic group with lineage tracing back to a glorious ancient people led by those legendary figures. Nor did I question the notion that the Chinese have a “sacred and inviolable right” to every inch of the “Chinese land”, including Tibet, Xinjiang, Mongolia, and Taiwan, even though these provinces were relatively recent conquests by the Manchus, a nomadic people disparaged as foreign invaders as recently as the early 1900s.

Jewish nationalists were no different: they found their narrative in Moses’ commandments and Solomon’s mighty kingdom, despite the lack of evidence that either ever existed.

As Karl Deutsch quipped,

“A Nation … is a group of persons united by a common error about their ancestry and a common dislike of their neighbors.”

If deceptive self-aggrandizement is a standard practice among nation-states, why call BS only on the Jews?

Sand explains his rationale toward the end of the book. He argues that the Jewish identity, invented to justify and bolster the young state of Israel, is at odds with democracy, which requires that all people residing in a country be its sovereign. Since Israel is legally defined as a Jewish state, non-Jews living in the country are treated as undesirable aliens and are, to various degrees, segregated, excluded, and discriminated against. As a result, Israel remains an incomplete democracy or a low-grade democracy. In Sand’s view, this status quo is not only less than ideal but ultimately unsustainable, since

the myth of the Jewish ethnos as a self-isolating historical body that always barred, and must therefore go on barring, outsiders from joining it is harmful to the State of Israel, and may cause it to disintegrate from within.

Therefore, Sand calls for the “creation of a democratic binational state between the Mediterranean Sea and the Jordan River” as the ideal solution to the century-long Israel-Palestine conflict.   Ideal as it may be, Sand’s one-state solution is a non-starter for either side of the conflict nowadays.  Indeed, even he concedes that such a solution, which would likely condemn Jewish Israelis to a permanent minority status in their own state, might be asking too much.   Given their historical experiences over the past two thousand years, it is understandable that Jews are wary of being a minority, especially in a country where the majority adheres to a different faith and may consider itself permanently at war with non-believers and apostates. That leaves us with the two-state solution, which remains viable but barely so.

In all likelihood, the endless cycles of violence and truce will continue, as Israelis and Palestinians remain locked in a perpetual life-and-death struggle over a piece of real estate that they could have shared in peace and prosperity. Sand mentioned that many Palestinians may, in fact, be descendants of Jews who voluntarily converted to Islam after the Arab conquest. If that is true, I cannot think of a more poignant example of the perverse power of religion, which has turned the same people against each other in such a tragic and horrific manner.

 

Marco Nie, Wilmette, IL

Theory of Moral Sentiments

The Theory of Moral Sentiments is Adam Smith’s first book. Compared to The Wealth of Nations, his magnum opus, it is far less well known.  Stephen Dubner discussed it extensively in a Freakonomics series, which argued that Smith has been misread by modern economists like Milton Friedman, and that the real Adam Smith was in fact an “affable moral philosopher” rather than “the patron saint of cutthroat capitalism”. The podcast piqued my interest in Adam Smith and his theory of moral sentiments. The book was not an easy read for me, as it took some time to adjust to the 18th-century writing style.  However, I think the time was well spent.

Central to Smith’s theory is the proposition that the perception of right and wrong comes from sense and feeling rather than reason.  Human happiness, according to Smith, chiefly “arises from the consciousness of being beloved”.  Because we desire to be loved by our brethren—taken to mean relatives, friends, neighbors, and countrymen—we seek their approval and avoid their disapprobation. It is through this pursuit of love and happiness that humans acquire sympathy, the ability to share and approve the feelings or interests of another person.  However, to truly sympathize with another’s feelings—to empathize with them (although Smith never used this term)—we must first overcome our own selfishness.

To make this crucial point, Smith proposes a thought experiment, which imagines how “a man of humanity in Europe” would react to the news that a huge earthquake has suddenly destroyed China and all its people. He would, Smith wrote,

“express very strongly his sorrow for the misfortune of that unhappy people, he would make many melancholy reflections upon the precariousness of human life, and the vanity of all the labours of man, which could thus be annihilated in a moment. He would too, perhaps, if he was a man of speculation, enter into many reasonings concerning the effects which this disaster might produce upon the commerce of Europe, and the trade and business of the world in general.”

However, after “all this fine philosophy was over”, the man would return to his regular life as if nothing had happened. Indeed, an accident of trivial scale—compared to that catastrophe in China—befalling him, say the loss of a little finger, would cause him to lose more sleep than he would over “the destruction of that immense multitude”. If this is so, Smith asks, would this person be willing to sacrifice the lives of all those Chinese to prevent that “paltry misfortune to himself”?  Smith claims humankind has never produced a villain capable of entertaining such a horrific thought. On this point I disagree with him, though his faith in humanity is understandable. After all, Smith never witnessed the world wars, heard of the Holocaust, or met the infamous dictators of the 20th century.

Smith claims that what prevents most people from placing their own interests above the greater interests of others is an impartial spectator who grows and resides within them.  The impartial spectator is “the great judge and arbiter of our conduct”, who teaches us that

“we are but one of the multitude, in no respect better than any other in it; and that when we prefer ourselves so shamefully and so blindly to others, we become the proper objects of resentment, abhorrence, and execration. It is from him only that we learn the real littleness of ourselves, and of whatever relates to ourselves, and the natural misrepresentations of self-love”.

Thus, to become a moral person is to forge and train this impartial spectator, and to be guided by him.  There is a subtle but crucial difference between a moral person and a virtuous one: the former merely follows the impartial spectator’s rules, whereas the latter adopts and embodies his moral sentiments. In some sense, the virtuous person becomes a proxy of the spectator, unified with him in both spirit and conduct, thereby entering a state of spiritual freedom in which the bounds of moral constraint are no longer felt.

Impartiality is central to many theories of morality. For example, John Rawls’ “veil of ignorance” serves as an instrument of impartiality in his theory of justice. Smith’s impartial spectator also resembles what a Confucian would call the “inner sage” (内圣), or the “innate moral knowledge” (良知) in Wang Yangming’s Theory of Mind (心学).  The unifying state achieved by a virtuous person, I believe, is “知行合一” (the unity of knowledge and action) in the Theory of Mind, and the process of arriving at that state is called “致良知” (extending one’s innate moral knowledge).  Like Smith, Wang also emphasizes sympathy as the approach to morality.  In Instructions for Practical Living (传习录), he writes,

“世之君子惟务致其良知,则自能公是非,同好恶,视人犹己,视国犹家,而以天地万物为一体。” (If the gentlemen of the world devote themselves simply to extending their innate moral knowledge, they will naturally be able to judge right and wrong in common, share likes and dislikes with others, regard others as themselves and the state as their family, and take heaven, earth, and all things as one body.)

Thus, with the help of the impartial spectator (良知), the virtuous person (君子) can be just (公是非) and have empathy (同好恶,视人犹己).

Smith believes moral norms first emerged to forbid actions that inflict pain on a person, such as endangering their life and body, depriving them of their possessions and property, and violating their right to basic liberty.  This is because humans are disposed to sympathize with sorrow more strongly than with joy.  Moral norms are extremely important, as they form the laws of justice, without which human society cannot survive.  Yet the sense of justice only enables people to behave with minimum propriety and decency.  To Smith, it is a mere “negative virtue” that does no real positive good.

Throughout much of the book, Smith explains the transition from adhering to basic moral norms to cultivating positive virtues. The mechanism is still sympathizing, and the secret is to overcome the less desirable aspects of human nature.

What makes us jealous of the success or good fortune of another person?  Again, the reason is that humans are generally more focused on avoiding pain than on seeking happiness. As a result, it is more difficult for us to will the good of our brethren—i.e., to truly love them—than to avoid harming their person and property.  The sentiment of envy is strongest when the person is regarded as an upstart.  As Smith notes,

“The man who, by some sudden revolution of fortune, is lifted up all at once into a condition of life, greatly above what he had formerly lived in, may be assured that the congratulations of his best friends are not all of them perfectly sincere.”

However, thanks to the impartial spectator, we are also ashamed of our own envy, and “often pretend, and sometimes really wish to sympathize with the joy of others”.  A man who fights and wins this battle with envy is capable of that magnanimous act of willing the good of his brethren, loving them as much as he loves himself. He may also learn to maintain prudence and humility no matter what stellar successes he has achieved and how entitled he feels to boast about them.  Sympathy reminds him that, by overly displaying joy in his achievements, he could arouse envy and jealousy among his brethren, along with the shame and self-pity that come with them.  Therefore, he always “endeavors, as much as he can, to smother his joy, and keep down that elevation of mind with which his new circumstances naturally inspire him.”

Smith was not a utilitarian, despite being revered as the father of economics—a discipline built on the notion of the utility-maximizing homo economicus—and enshrined as a god of capitalism.  As the book makes abundantly clear, Smith did not endorse, much less celebrate, cold-blooded self-interest. His famous “invisible hand” explains why society can work well despite, not because of, its members being utterly self-interested.  Surprisingly, he made the same point in this book, first published in 1759, seventeen years before The Wealth of Nations. He writes that the rich,

“though they mean only their own conveniency… are led by an invisible hand to make nearly the same distribution of the necessaries of life, which would have been made, had the earth been divided into equal portions among all its inhabitants, and thus without intending it, without knowing it, advance the interest of the society, and afford means to the multiplication of the species.”

While Smith believed that self-interest can be guided toward positive outcomes by the invisible hand, he clearly opposed such consequentialism in matters of morality. He was deeply troubled by the fact that “the world judges by the event, and not by the design”, which he called “the great discouragement of virtue” throughout the ages. Smith conceded that, in the realm of justice, punishment should be proportional to the consequences of our actions rather than our intentions. However, he forcefully argued that the opposite should apply when assessing our own character and conduct.

In this regard, Smith is nearly a moral idealist. He believes we should strive for “exact propriety and perfection” rather than settle for the lower standard “which is commonly attained” by most people. Smith argues that focusing on the inferior standard is what led many historical figures to become arrogant, presumptuous, and extravagantly self-admiring.  Self-admiration may be necessary for their success, as it drives great men to pursue ventures that a more cautious mind would never consider.  “When crowned with success”, however, this presumption “has often betrayed them into a vanity that approached almost insanity and folly”, and “precipitated them into many rash and sometimes ruinous adventures”.  Somehow, Elon Musk’s face crossed my mind when I read that passage.

Since to be loved by others generally means to receive their attention and praise, a great deal of human energy has been consumed by the struggle to stand out and be recognized.  Smith refers to this desire for attention and praise as “vanity”.  Although vanity is not inherently a vice, it becomes problematic when it is directed towards the wrong objects. Therefore, writes Smith,

“the great secret of education is to direct vanity to proper objects”.

Because a man sees wealth and power attract attention and submission, he is often compelled to pursue them. Similarly, observing that fame and glory earn respect and praise, he aspires to be famous and honored. Consequently, he mistakenly equates these pursuits with achieving love and happiness. Smith tells us that

“nature has endowed a man, not only with a desire of being approved of, but with a desire of being what ought to be approved of.”

Wealth, power, fame, and glory all signal approval from others, but not necessarily “what ought to be approved of”. To Smith, pursuing praise and pursuing what is praiseworthy are distinctly different. The former often leads us to chase misguided objects of vanity, while the latter inspires a genuine love of virtue.  A virtuous man derives little pleasure from praise where it is not due; instead, he often feels the greatest satisfaction in doing what is praiseworthy, even though “no praise is ever to be bestowed upon it”. Thus, “to be that thing which deserves approbation” is “an object of the highest” to him. If he succeeds in this endeavor, he no longer needs approval from others.  He becomes assured of “the perfect propriety of every part of his own conduct” and content with his self-approbation, which, according to Smith, is virtue itself, the only thing he can and should care about.

Smith’s emphasis on praise-worthiness rather than praise, and on self-approbation rather than the approval of others, appears to be rooted in Stoicism.  Smith writes that the Stoics believed

 “human life…ought to be regarded but as a mere two-penny stake. …Our only anxious concern ought to be, not about the stake, but about the proper method of playing. If we placed our happiness in winning the stake, we placed it in what depended upon causes beyond our power, and out of our direction. We necessarily exposed ourselves to perpetual fear and uneasiness, and frequently to grievous and mortifying disappointments. If we placed it in playing well, in playing fairly, in playing wisely and skillfully; in the propriety of our own conduct in short; we placed it in what, by proper discipline, education, and attention, might be altogether in our own power, and under our own direction. Our happiness was perfectly secure, and beyond the reach of fortune.”

In a nutshell, to shield our happiness from the whims of fortune, we should remain as indifferent as possible to praise, recognition, and all the superficial allurements of vanity. This philosophy aligns with a precept I learned many years ago from a Chinese author: 但行好事,莫问前程 (Focus on doing the right thing, rather than on achieving the perfect outcome).  It also echoes my favorite quote from Daniel McFadden’s Nobel Prize autobiography (the emphasis is mine):

“My parents taught me that to lead a virtuous life, I should be modest, take my satisfaction from work done well, and avoid being drawn into competition for status and rewards.”

This idea is precisely what I have been trying to tell any of my doctoral students who would listen: To truly enjoy academia, you must find joy in the research itself, independent of any external rewards it might bring, whether that’s funding, awards, or even the opportunity to change the world.

Marco Nie

April 14, 2024, Evanston, IL.

Chip War

Chris Miller masterfully tells the story of the spectacular rise of the semiconductor industry (the “chip”) and its ever-growing entanglement with geopolitics (the “war”).  It’s a fascinating narrative, filled with ups and downs, twists and turns, heroes and villains, victors and losers—well worth reading for its own sake.  It is a must-read if you want to understand the current U.S.-China relationship and the slow-moving crisis hanging over the Taiwan Strait.  Semiconductors have become central to the U.S.-China relationship, with one side aggressively playing catch-up and the other striving to maintain its waning lead.  Taiwan has the misfortune of being caught in the middle of this seemingly inevitable epic clash, not so much because it offers a beacon of hope for “the free world” as because it houses Taiwan Semiconductor Manufacturing Company (TSMC), the sole fabricator of the world’s most sophisticated chips.

As I read about the legends of semiconductors unfolding in the book, I came to realize my own ignorance about an industry that has profoundly transformed humanity.

I did not know William Shockley who, along with two other scientists at Bell Labs, invented the transistor. He also started a company called Shockley Semiconductor Laboratory that counted Gordon Moore (yes, the Moore after whom Moore’s law is named) and Robert Noyce among its first hires. The pair would later rebel against Shockley and go on to become giants of the burgeoning industry. They first founded Fairchild Semiconductor, which supplied the computing power that landed men on the moon in the 1960s, and then Integrated Electronics, or Intel – a household name in today’s tech world.

I had never heard of Texas Instruments (TI) before reading the book.  But among TI’s early employees were Jack Kilby, who won the Nobel Prize in Physics in 2000 for inventing the integrated circuit (集成电路); Jay Lathrop, who pioneered photolithography (光刻); and Morris Chang, an immigrant from Mainland China and the founder of TSMC.

Nor could I distinguish between memory chips and logic chips, PC chips and smartphone chips, or deep ultraviolet (DUV) lithography and extreme ultraviolet (EUV) lithography.  What struck me the most, however, is the incredible difficulty of keeping up with Moore’s law, which posits that the number of transistors on a microchip doubles approximately every two years. Indeed, cutting-edge chips have become so complex that TSMC is the only manufacturer in the world capable of fabricating them at scale.  TSMC does this with “ultra-pure silicon wafers and specialized gases from Japan” and machinery that “can etch, deposit, and measure layers of materials a few atoms thick”. Supplied by only five companies, these tools themselves took decades and astronomical sums of money to develop, and their core technologies are closely guarded trade secrets.  Take the development of EUV lithography, for example. The project was launched in the early 1990s thanks to a $300-million investment from Intel.  However, it wasn’t until nearly 30 years and billions of dollars in spending later that the Dutch manufacturer ASML finally brought EUV scanners to market in 2018, at a price of $100 million apiece for an expected lifetime of four years. For a layman like me, it is mind-boggling to read just how the scanner produces enough EUV light for fabrication:

The best approach was to shoot a tiny ball of tin measuring thirty-millionths of a meter wide moving through a vacuum at a speed of around two hundred miles per hour. The tin is then struck twice with a laser, the first pulse to warm it up, the second to blast it into a plasma with a temperature around half a million degrees, many times hotter than the surface of the sun. This process of blasting tin is then repeated fifty thousand times per second to produce EUV light in the quantities necessary to fabricate chips.

It does sound like a miracle, as Miller put it, that something this delicate not only works, but “does so reliably enough to produce chips” that can make lots of money.

This sums up the history of chips. What about war?  The book describes three chip wars that took place between the U.S. and her rivals in different eras.

The war with the Soviet Union, fought mostly in the first half of the Cold War, was won with relative ease. The USSR treated its semiconductor industry as a weapons program, much as it treated nuclear and space technology.  In hindsight, this strategy was a huge mistake: the immensely profitable civilian applications of semiconductors turned out to be a driving force for innovation that no level of government spending could hope to rival.  Faced with the lack of progress, the Soviets tried to copy U.S. technology through espionage. Yet this did not work either.  For one, even the most skilled spies cannot steal all the technical know-how embedded in complex production processes. More crucially, the “copycat” mindset condemned the Soviets to a perpetual game of catch-up, rather than allowing them to lead the way.

Japan was a much greater threat.  Thanks to favorable technology-transfer and trade policies that the U.S. willingly offered in exchange for Japanese support of America’s global order, Japan’s semiconductor industry evolved from a niche player specializing in consumer electronics in the 1960s and 1970s into a formidable powerhouse in the 1980s. By 1985, Japan had begun to outspend the U.S. in capital investment for semiconductors, and by the end of that decade, it had become the dominant supplier of the world’s Dynamic Random-Access Memory (DRAM) chips (90% market share) and lithography equipment (70%). Trade disputes soon ensued.  The skirmish started with the U.S. accusing Japan of espionage, double-dealing, and dumping.  It escalated to the point where the U.S. openly threatened tariffs, ultimately compelling Japan to impose quotas on its exports of DRAM chips to the U.S. in 1986.  This did not help Silicon Valley recover its lost ground, however.  Eventually, nearly all American companies, including Intel, were driven out of the DRAM and lithography markets.

Carried away by their astonishing success, the Japanese began to dream about, in the words of Sony founder Akio Morita, overcoming the United States economically and becoming “number one in the world”.  The U.S. was understandably frightened by the pent-up nationalism revealed in The Japan That Can Say No—which Morita co-authored—and by the gloomy prospect of relying on a foreign country to maintain the most important edge of her military. In response, the U.S. launched a campaign to curtail Japan’s dominance in the chip-making industry.  The core strategy involved mobilizing South Korea (Samsung), Taiwan (TSMC), and to a lesser extent, Mainland China, to erode Japan’s competitive advantages through cut-throat competition with her companies.  It worked like magic.  By 1998, Japan’s share of the DRAM market had fallen to 20% from a near monopoly less than a decade earlier, while South Korea dethroned Japan as the largest producer of memory chips. Not only did Japanese firms suffer tremendous share losses in the DRAM market, but they also missed the emerging opportunities in the nascent PC market.  In what Miller dubbed one of the greatest comebacks in industry history, Intel, under Andy Grove’s leadership, reinvented itself as the king of microprocessors for PCs.  For what seemed like an eternity in this fast-paced industry, Intel was the icon of the PC world, the blue trademark of its processors the most recognizable feature on most PCs sold globally. Indeed, I remember the first PC I ever owned—which my five college roommates and I purchased in 1995 with pooled funds—simply as a 486, because it was powered by Intel’s 486 microprocessor.  According to Miller, that very chip was the first ever with over a million transistors!
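Miller’s invocation of Moore’s law invites a quick back-of-envelope check. The sketch below is my own illustration, not from the book: it projects the 486’s roughly 1.2 million transistors (1989) forward at one doubling every two years.

```python
# Back-of-envelope illustration of Moore's law: transistor counts
# double roughly every two years.

def project_transistors(start_count: int, start_year: int, end_year: int) -> int:
    """Project a transistor count forward, one doubling per two years."""
    doublings = (end_year - start_year) / 2
    return int(start_count * 2 ** doublings)

# Intel's 486 (1989) packed roughly 1.2 million transistors.
print(f"{project_transistors(1_200_000, 1989, 2019):,}")  # → 39,321,600,000
```

Thirty years of doubling turns 1.2 million into about 39 billion, which is indeed the order of magnitude of the largest chips at the end of the 2010s—a reminder of how relentless the exponential treadmill described in the book really is.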

This brings me to the latest, still-ongoing chip war with China.  On the surface, the plot of the Chinese edition resembles that of Japan’s: the wary incumbent hegemon, spooked by the rapid ascent of an upstart, is compelled into massive counteractions to neutralize the threat, real or imagined.  However, unlike Japan, China has never really overtaken the U.S. in any high-end technology area of the semiconductor industry.  Not even close.  According to Miller, toward the end of the 2010s, China had less than 1% of the global chip-design software tool market, about 2% of core intellectual property related to “the building blocks of transistor patterns”, 4% of silicon wafers, 1% of fabrication machinery, 5% of chip design, and 7% of fabrication, concentrated in non-cutting-edge chips.  If that is the case, has the U.S. overreacted with her heavy-handed sanctions and embargoes against China’s tech sector?

Regarding this, Miller’s insights on the crackdown on Huawei were particularly enlightening. He acknowledged that the charges against Huawei, which included theft of intellectual property, ties with the Chinese military, and violation of U.S. sanctions on Iran, were “ultimately a sideshow” – basically a euphemism for made-up excuses.  The real issue was, Miller wrote,

That a company in the People’s Republic of China had marched up the technology ladder… Its annual R&D spending now rivaled American tech giants…, it was the most successful exporter [of all Chinese tech companies], giving it detailed knowledge of foreign markets. It not only produced hardware for cell towers, [but] it also designed cutting-edge smartphone chips. It had become TSMC’s second biggest customer, behind only Apple.

Therefore, the real question was: “Could the United States let a Chinese company like this succeed?” That is a rhetorical question in case you did not catch the drift. But why?

I can think of several reasons.  

First, unlike Japan, China is not a liberal democracy.  Judging by what has been going on in the country since the early 2010s, China clearly has no interest in becoming one anytime soon. To make things worse, under the current leader, China has repeatedly asserted that perhaps her system, rather than America’s, should be the model that the rest of the world admires, envies, and emulates.  Even when Morita lectured Americans about the superiority of the Japanese system, it was seen in Washington as a serious provocation – and he wasn’t even talking about authoritarianism.

Second, unlike Japan, China has never pledged allegiance to the America-led world order.  In fact, in the past decade, China has decisively shifted from biding her time as a halfhearted participant of that order to openly flirting with the idea of challenging it, economically, technologically, and if necessary, militarily.

Third, China has increasingly embraced nationalism as a rallying cry for its people to coalesce around the current regime. However, the inherent logic of this political agenda requires the “unification” of the motherland, regardless of the cost. Whether this stance is a concrete national policy or merely a slogan to appease the little pinks on the internet remains to be seen. Yet it does place China on a collision course with Taiwan and the U.S.  When push comes to shove, the U.S. could find herself in a treacherous standoff with what she now regards as a “peer competitor”. The stakes are incredibly high. Retreating from the American commitment to Taiwan’s security would spell the end of the current global order, potentially plunging the world into chaos.  More importantly, losing Taiwan could hand China a golden opportunity to erode America’s technological supremacy, which has been a cornerstone of her national security since at least World War II.

As of this writing, China has been denied access not only to high-end chip-making technology but also to the high-end chips themselves. Lacking essential tools (e.g., EUV scanners) and raw materials (e.g., pure silicon wafers), China’s semiconductor industry, as well as her tech sector in general, is likely to fall behind. Indeed, it has already missed out on the latest gold rush in AI, particularly the triumph of large language models, partly because her access to computing power (GPUs) was severely restricted by sanctions.

Could China break this “neck-strangling” (卡脖子) situation entirely through its own initiatives? Before reading this book, I thought there must be a way, if the nation threw its entire weight behind the challenge.  I am much more pessimistic now. If there’s one thing I’ve learned from the book, it’s that the creation of cutting-edge chips can no longer be achieved within the borders of a single country, not even the U.S. Moreover, the pursuit of technological innovations as a nationalistic project may not be a sound strategy for long-term success, as demonstrated by the failure of the USSR.

Could the chip war have been averted had China chosen a different path in the early 2010s?  What impact will the current conflict have on the “China Dream” and the lives of 1.4 billion Chinese? No one knows the answer.  One can only hope that the war remains confined to the realm of chips and continues to be fought by scientists and engineers with their computers, rather than soldiers with guns and missiles.

 

Marco Nie

Wilmette, IL

3/3/2024

A letter to Henry Ford

As part of her social studies homework, my daughter, Jolene, wrote a letter to Henry Ford, imagining it from the perspective of Greta Thunberg.  I liked the letter but thought it did not sound much like Greta.  Then it occurred to me that I could ask ChatGPT to rewrite it in Greta Thunberg’s style.  Intrigued by the idea, Jolene enthusiastically consented to the experiment. She has also given her permission for both the original letter and ChatGPT’s adaptation to be shared here.  I hope you have as much fun reading these as I had!


Jolene’s original letter

Herr Henry Ford,

Isn’t it interesting how one day can change your life, and you wouldn’t know? I remember that day like it was yesterday. I remember sitting down at my desk at school, not knowing that my story was just about to start. The teacher announced that we would be watching a film about the state of our environment. I remember how as it progressed, my feelings grew from curiosity to horror as pictures of endless mountains of garbage floating in the ocean flitted across the screen. I will be the first to admit it: I cried. I remember every vivid detail, though my classmates all forgot the atrocities they just witnessed mere minutes after the film ended, apparently deciding that discussing their weekend plans was more important than the outrageous amounts of garbage and smoke that pollute our planet. It’s a strange feeling, looking back and knowing that this film would be forever engraved in my mind. I’m sure you would feel the same about the assembly line, looking back at when you invented it, when you didn’t know you would revolutionize the automobile forever. I suppose that means I shouldn’t blame you for the polluting smog cars let out into the environment every day. Yet, as I sit here in my room, sun warming the paper I write on, I can hear the cars passing my home, and I can picture the cloud of smoke that lingers behind. Too much has been done to our Earth for me not to take a stand.

I understand that when you were innovating and popularizing the automobile, your intentions must have been good: to make people’s lives easier and better. If that was your goal, then you achieved it. At least, at the time you achieved it. On the surface level, it seemed perfect: more jobs were created that more workers were qualified to do, which made our country’s economy stronger. Cities became less crowded as people moved to neighboring towns to form suburbs since they could drive to work. It bridged a gap between different classes of people as it became a norm that men and women, rich and poor alike drove together on the roads. But at what cost?

In 2024, the global temperature has increased by about 1ºC since 1900, with the majority happening after 1970. As a result, the ocean has warmed as well, causing ice caps to melt and the sea level to rise. A main cause of this is the burning of greenhouse gasses. Only approximately 8% of Americans do not own a car. A glance out the window during a busy time of day will show streets packed with everything from fancy red convertibles to rusty pickup trucks. I have worked to decrease the damage our race causes to the Earth, yet it still rises. People are all for helping to stop climate change until it actually inconveniences them. Few are willing to give up constantly driving their car instead of walking, biking, or using public transportation, or to fly across the world a few less times. While in my mind I realize that not all of this is your fault, my heart protests against it, aching for the state of our world. Neither of us can help it if your life’s work happens to clash against mine.

All of this goes to say that, while I respect how hard you worked to make the automobile what it is today, I will work twice as hard to ensure that the damage is undone. I will not apologize for that.

Greta Thunberg


ChatGPT’s Version

Dear Mr. Henry Ford,

The day everything changed for me was just another day at school, until it wasn’t. We watched a film about our planet’s environmental crisis, and the images of pollution and destruction struck me to the core. It was a moment of awakening — a stark, horrifying realization that we are in the midst of a grave crisis. I remember the despair I felt, a sharp contrast to the indifference of my peers. Such moments define us, much like the day you pioneered the assembly line must have defined you, unknowingly setting off a chain of events that would lead us here.

Your innovation brought mobility and progress, but at a catastrophic environmental cost. We now face a planet warmed by over 1°C since 1900, mainly due to emissions post-1970. The burning of fossil fuels, a legacy of the automotive boom you ignited, has pushed our planet to the brink. Our ice caps are melting, our seas are rising, and our future is at stake.

I understand that the environmental crisis we face today cannot be solely laid at your feet. However, the industry you created has played a pivotal role in this unfolding tragedy. As I witness the relentless march of cars outside my window, each contributing to a cloud of pollution, I am reminded of the urgency of our situation.

Mr. Ford, while your intentions may have been to better the human condition, the reality is that we are now paying a heavy price for this ‘progress.’ I respect your ingenuity, but I cannot overlook the environmental devastation it has wrought. I am committed, more than ever, to fight for a sustainable future, to challenge the status quo, and to reverse the damage we have done to our Earth. This is not just a promise; it is an imperative.

We are at a crossroads, and I will not rest until we choose the path of sustainability, responsibility, and survival. I will advocate, protest, and work harder than ever to ensure a future where technology serves our planet, not destroys it. This is the mission of my generation, and I accept it wholeheartedly.

Yours in determination,
Greta Thunberg

Crusaders

Dan Jones is a great chronicler.  He knows how to turn dry events into vivid stories, which characters to focus on so his narrative always has a human anchor point, and when to make witty quips without seeming overly opinionated.   Some writers have the talent to captivate their audience with no more than the charm of their language – I think Jones is one of them.

“Crusaders” covers nearly four centuries of medieval history, from the end of the eleventh century CE, when Pope Urban II began to preach the Holy War against the infidels in the east, to the conquest of Jerusalem by the Ottoman Empire in 1517.   Officially, crusading met its calamitous end in 1291, when Acre, the last stronghold of the Christian kingdoms in the east, fell into the hands of the Mamluks.   However, as a phenomenon, crusading continued until Columbus’s discovery of America––which was “full of things to trade or steal, and teeming with people to subjugate, convert or kill”––convinced Western Christendom that its future “lay to the west, not the east”.

Out of this eventful and bloody chapter of human history stand a few prominent and complicated characters who, I think, deserve some ink even in a brief book review.

Richard the Lionheart, the legendary king of England who spent most of his adult life in France, was the commander in chief of the Third Crusade.   Rumored to be gay, Richard was famed for his martial prowess, courage and generosity. He was also a man of letters who loved lyric poetry and music and courted the poets of the High Middle Ages.  Under Richard’s leadership, the crusaders retook Acre and delivered a string of humiliating blows to the army of the mighty sultan Saladin of the Ayyubid dynasty, but ultimately fell short of seizing Jerusalem itself.  The struggle ended with a negotiated truce that placed the coastal towns between Jaffa and Acre under Christian rule, while allowing Christian pilgrims and merchants access to the Holy City.  Although the settlement helped stabilize the Kingdom of Jerusalem for decades to come, it forever transformed crusading from a religious imperative into an enterprise of colonization.

Like many powerful men of his age, Richard is often reproached in history books for being lustful, greedy, and cruel.  I suspect some of Richard’s vices were exaggerated by clergymen who resented being forced to pay for his military adventures.  That said, the extent of Richard’s cruelty is indisputable.  The most notorious episode was the execution of 2,600 unarmed and bound prisoners of war at Acre, in retaliation for Saladin’s failure to fulfill his promise to “return the relic of the True Cross and pay his bounty”.   Technically legal as it may have been, noted Jones, this despicable act of cruelty was “excessive even by the standards of the day”.  Little wonder Richard’s name acquired such infamy in the Muslim world that it was often invoked by impatient mothers to quiet their unruly children.

Enrico Dandolo, the doge of Venice, was the hero––or the villain, depending on whom you ask––of the Fourth Crusade.  He took the cross at the incredibly advanced age of 95, having gambled his country on a military alliance under which Venice would equip and supply the Fourth Crusade in exchange for 85,000 silver marks.  When Dandolo realized his airheaded partners could not pay their dues, he decided to save Venice from bankruptcy by what essentially amounted to organized robbery.   His first target was the city of Zara, a possession of King Emeric of Hungary, who was not only a pious Christian but also a fellow crusader.  Zara’s sacking infuriated Pope Innocent III, as he had explicitly forbidden it.  As a result, all Venetian crusaders were “excommunicated”, i.e., officially expelled from the Catholic Church.  Dandolo couldn’t care less. He soon seized another opportunity that promised even more money, by injecting the crusaders into a conspiracy aimed at dethroning the Byzantine emperor.  There is no space to recount the entire drama – suffice it to say that it led to the siege and fall of Constantinople in 1204.  Once again, Dandolo’s allies failed to hold up their side of the bargain, so it seemed as if he almost had no choice but to help himself to what had been promised to him.  For three days, the crusaders vandalized the richest city in Christendom.  The total value of the loot amassed during their plundering is estimated at around 900,000 silver marks.  If this figure is accurate, then Venice’s investment in the Fourth Crusade yielded a staggering tenfold return.   Dandolo thus exemplified the notion of prospering by doing God’s bidding – a modern entrepreneur from Silicon Valley would recognize this as the medieval version of “doing well by doing good”.

At the time, many ancient and medieval Roman and Greek works were stolen and sent back to Venice. The most notable were the four bronze horse statues from the Hippodrome, believed to have been crafted in the second or third century CE.    When I visited Venice in the summer of 2023, a replica of these magnificent statues was indeed, as Jones teased, “still proudly displayed at Saint Mark’s Basilica.”  Our Venetian tour guide was careful not to dishonor what is considered a national treasure in her country. The horses, she told us, were “brought back” from Constantinople 800 years ago.

Dandolo died a year after the fall of Constantinople. He was 98 and had been visually impaired for more than three decades.  The crusaders understandably cheered what they had accomplished under the command of this aged and frail man as a miracle.  To many a Christian, however, the brutal sacking of Constantinople was a dark and scandalous chapter in the history of their faith.   The cruel irony—a mission sanctioned by the Catholic papacy resulting in the destruction of the spiritual capital of Eastern Orthodoxy—was simply beyond the pale.  Jones aptly summarizes Dandolo’s controversial involvement in the crusade:

“He had bravely defied his physical disability and his decrepitude, and his pragmatic leadership and dauntless personal valor were beyond question. Yet in the end Dandolo had turned his talents to a wholly disreputable end, playing a leading part in a dreadful episode that, even by the cruel standards of the crusading era, thoroughly deserved the epithet leveled against it by Choniatēs: ‘Outrageous.’”

Another fascinating historical figure from this era is the leader of the Sixth Crusade, Frederick II, emperor of the Holy Roman Empire.   His famed grandfather, Frederick I “Barbarossa”, drowned while attempting to cross a river during the Third Crusade.  About 750 years later, Adolf Hitler, in a seemingly ironic twist, named his ill-fated Russian campaign after the elder Frederick.  Frederick II, however, succeeded where his progenitor faltered. Through an agreement reached with the Ayyubid sultan Al-Kamil, he regained control of Jerusalem in 1229, a feat that three costly crusades had failed to accomplish in four decades.  To be sure, Frederick II enjoyed good fortune, as the Ayyubids were distracted by potential conflicts with their Muslim brethren in Syria and Mesopotamia. However, there is no question that the emperor’s intelligence, personality, political acumen and breadth of knowledge also played a crucial role. Frederick II was, in the words of Jones, “a shockingly liberal intellectual and a bluntly pragmatic ruler”.    He spoke six languages, including Arabic and Greek, and boasted a reputation as a polymath.

Frederick was a man with an insatiable curiosity about the natural world that extended far beyond the tenets of Christian Scripture. He loved natural sciences, astrology, logic, rhetoric, medicine, law, philosophy and mathematics…(and) surrounded himself with Latin, Greek, Muslim and Jewish tutors, advisers, poets, scholars and bureaucrats. Well into adulthood, he retained a personal Arab tutor in logic, and he corresponded with Jewish and Arab scholars in southern Spain.

In short, Frederick was a philosopher king in the Platonic ideal, reminiscent of figures like Marcus Aurelius of the Roman Empire and Kangxi of the Qing Dynasty in China.

Paradoxically, the “greatest and least bloody crusading victory” won by Frederick was met with universal condemnation rather than exaltation among his fellow crusaders.  When the emperor left Acre, it was reported, he was “hated, cursed, and vilified”. Why? Ostensibly, the reason was that his participation in the Sixth Crusade was technically illegal, because he had been excommunicated by the pope for allegedly failing to honor his previous crusading pledge.  However, his quarrel with the papacy ran deeper, and relations only deteriorated further following his triumph in the east.  Eventually the most successful crusader of his time would himself become the target of a crusade officially endorsed by the Catholic Church.  Although Frederick “could be infuriating, overbearing and self-serving”, concluded Jones, it is still difficult to “conceive of a greater perversion of the institutions and language of crusade than for such a war to be preached against” him.

Beneath the veneer of glory surrounding these crusading kings and generals lay unspeakable violence, horrific human suffering, and ferocious atrocities.  After all, as Jones noted, “there was precious little time for thoughts of human rights on either side” of the crusading divide.

When Baldwin II of the Kingdom of Jerusalem laid siege to Aleppo in 1124––toward the end of his futile effort to break into the Syrian interior––his army reportedly engaged in “elaborate rituals of depravity” against the city’s Muslim residents.  According to Jones, the crusaders

“raided Muslim funeral chapels, took coffins to repurpose as storage chests for their camp, then goaded the citizens with the sight of their dead relatives’ corpses being grotesquely desecrated…Whenever the Franks captured an Aleppan Muslim, they cut off his hands and testicles.”

During the Fifth Crusade, Damietta, the third-largest city in Egypt, endured a siege lasting a year and a half.  Even the battle-hardened crusaders were apparently horrified by what they saw in the once-thriving city, which had been transformed into a “fetid, disease-ridden graveyard, inhabited by mere skeletons and ghosts.” The few survivors were overwhelmed, unable to bury the countless corpses that littered the streets, and the stench “was too much for most people to bear”.   Shocked as they might have been, the crusaders showed little pity, much less remorse. Soon enough, wrote Jones, “Christian thieves” began to “run around taking what they could” and to force starving Muslim children to undergo baptism.

When Jerusalem fell to the raid of a Khwarizmian (花刺子模) mercenary army in the service of the Ayyubid sultan in 1244—only 15 years after Frederick’s diplomatic victory—it was utterly devastated. The Khwarizmians hunted down and slaughtered six thousand Christian civilians trying to flee the abandoned city. Then, on August 23,

the Khwarizmians entered the almost empty city of the Israelites and in front of the Sepulchre of the Lord they disemboweled all the remaining Christians who had sought refuge inside its church. … The marble around Christ’s tomb was either smashed or scavenged and the tombs of all the crusader kings of Jerusalem buried near Calvary were opened and their bones tossed away. Elsewhere other highly revered Christian churches and shrines received the same treatment: the priory at Mount Sion, the tomb of the Virgin Mary in the valley of Jehosophat and the Church of the Nativity in Bethlehem were all desecrated.

Ironically, the Khwarizmians were themselves victims of an even more formidable force. About 25 years earlier, the horde of Genghis Khan had besieged and pillaged Samarkand, the capital of their empire.   In some sense, Genghis Khan was indirectly responsible for the terrible losses suffered by Christians in 1244, as the collapse of the Khwarizmian empire had scattered its jobless soldiers across the region, like a deadly shock wave sweeping through the Middle East.  The Mongols, of course, did not discriminate between Christians and Muslims.  When they captured Baghdad, arguably “the most civilized of cities” at the time, they killed at least 100,000 Muslims.   Yet their worst crime against humanity was probably the destruction of the great city’s House of Wisdom, a library that “contained the largest and most sophisticated collection of books on earth” – so many books were thrown into the Tigris, wrote Jones, “that the water was said to have flowed black with ink.”

No medieval horror movie would be complete without the hideous crimes committed against Jews.  In fact, the First Crusade marked a tragic turn in the fortunes of the Jewish diaspora in Western and Central Europe.

In 1096, even before leaving their own countries for the First Crusade, French and German crusaders turned on local Jewish communities.  At Mainz, they stormed the residence of Archbishop Ruthard, where seven hundred Jews had sheltered under his protection.  The indiscriminate slaughter by this mob was so appalling that many desperate Jews killed each other to avoid execution by the “weapons of the uncircumcised”.  Similar mass murders took place elsewhere.  In Cologne, according to Jones, “young men and women threw themselves into the Rhine and fathers killed their children rather than see them fall into the hands of the enemy”.   This “orgy of anti-Semitic violence”, collectively known as the Rhineland massacres, is widely seen as a harbinger of what was coming for the Jews of Europe in the next millennium.

About a hundred years later, the fervent zeal ignited by the Third Crusade engulfed the English populace, and months of riots against England’s Jews ensued.  During this period, it was not uncommon to see mobs chasing and assaulting Jews in the streets and forcing them into coerced baptisms.  The worst incident occurred in York in March 1190, when hundreds of Jews who had sought refuge in the city’s castle were either killed or driven to mass suicide.  The persecution of Jews in England would continue and culminate in 1290, when the country officially expelled its Jewish population and enacted a ban that would last nearly four centuries.

Shortly after I finished reading “Crusaders”, on October 7th, 2023, Hamas militants perpetrated the worst mass murder of Jews since the Holocaust.  There is no need to recite the details of the crimes.  Antony Blinken, the US Secretary of State, summed it up well: “depravity in the worst imaginable way”.   Viewing the incident in the context of the Crusades, however, I felt I had seen this movie before. The latest version is set on the same stage and follows a similar plot, though played by different actors.  In this movie, it was Jews, rather than Christians, who were the infidels that Muslims sought to expel from the land they believed was righteously theirs.

History has never stopped projecting the conflicts in Palestine through the lens of the Crusades.  When British general Edmund Allenby marched into Jerusalem as a victor in 1917, ending the four-hundred-year control of the Holy City by the Ottoman Turks, he allegedly proclaimed that “the wars of the crusades are now complete”.   Whether he said it or not, the forecast was wrong. The British Mandate of Palestine would give way to the rebirth of the Jewish state, in what many Muslims saw as a continuation of the medieval crusades – only this time Jews and Christians were co-conspirators. Surely that was how Osama bin Laden saw it. In the “Letter to the American People”, now widely circulated thanks to TikTok, he wrote,

Palestine has been under occupation for decades, and none of your presidents talked about it until after September 11. … You should be aware that justice is the strongest army and security offers the best livelihood; you lost it by your own making when you supported the Israelis in occupying our land and killing our brothers in Palestine.

Likewise, President George W. Bush once likened the US response to the 9/11 attack to a crusade, warning the American people that “this crusade, this war on terrorism, is going to take a while”.  

Even the rhetoric sounds eerily similar, and it always invokes some version of a just war, i.e., “violence that was regrettable but legitimate and even moral, so long as it was undertaken to protect the state and would ultimately serve to produce or restore peace.”  Bin Laden put it more bluntly: “it is a sin to kill a person without proper, justifiable cause, but terminating his killer is a right.”  What remains unsaid, and perhaps unknowable, is who gets to decide which causes are proper and justifiable, and how far back in history one must trace them.

Hence, the life-and-death struggle for the Holy Land, waged in the name of the One True Faith, has never really ended. And the idea of crusading will perpetuate cycles of violence and suffering as long as there are crusaders on Earth.

 

Marco Nie, Northwestern University

December 30, 2023


The Song of Achilles

I read The Song of Achilles about two years ago and wrote a short review at the time, but never got the chance to post it here.  This is one of the few fiction books I have read cover to cover since I turned 40 – thanks to my daughter’s recommendation.


My 11-year-old daughter has lately fallen in love with Greek mythology and has filled her bookshelf with the likes of Percy Jackson and The Trials of Apollo.  Frustrated with my complete ignorance of the subject, she tried repeatedly to get me to read some of her books.  She marveled at The Song of Achilles all the time and insisted I must read it because it is simply “too good” to pass up.  Eventually, I caved in despite my reluctance—novels have largely ceased to interest me, let alone a novel about Achilles, whose story has become a cultural cliché, even in China. Who could forget the heel that his mom famously failed to wash in the magic spring?

It turns out I enjoyed the book more than I thought I could.  Madeline Miller kept me constantly guessing at the theme of the book, and she managed to outwit me at every turn.  Initially, it seemed the book was about the love between two young men: Achilles and the narrator, Patroclus. Then I thought the focus was the insanity of the Trojan war, and how it transforms an innocent boy into a monstrous killing machine.   At one point, Miller mocked nationalism and advocated humanitarian principles, proclaiming through Chiron (a centaur) that “nations were the most foolish of mortal inventions” and “no man is worth more than another, wherever he is from”. Eventually, I realized the central plot may be the ancient conflict between a jealous mother and her son’s spouse (a son-in-law in this case).  Achilles’s mom, Thetis, refused to endorse his relationship with Patroclus till the very end, even after the two were buried together.   In the eyes of the jealous mother, Patroclus was an unattractive mortal unworthy of Achilles, a man who could not bear him offspring, and above all someone who committed the unforgivable sin of sharing her son’s love.  But more fundamentally, Thetis and Patroclus each fought hard to bring about a different Achilles in the book: Thetis wanted a god-like, ruthless warrior, while Patroclus preferred an empathetic, creative human.  It seems to me that this discrepancy, not the prophecy, ultimately sealed the couple’s tragic fate.

Having finished the book, I must say I don’t quite understand why my daughter and her friends like it so much.  It is a book written for adults, with content that I imagine some parents might find objectionable for kids her age.  I know for a fact that in my generation such a book would have been considered off limits for an 11-year-old. But, hey, we live in a different age, don’t we?