
Games Without Rules

Before August 2021 I knew almost nothing about Afghan history. Nor did I care.  As a country, Afghanistan seemed neither interesting nor important, culturally or geopolitically.  Yes, it is famous for feverish Islamism, extreme poverty, and brutality against women; but there are plenty of such failed states to go around in the world.  Yes, it is nicknamed the “graveyard of empires”; but to most Chinese, there is nothing mysterious about burying empires in what Chairman Mao would call the “boundless ocean of people’s war”.

Then, in April 2021, President Biden announced the plan to withdraw from Afghanistan by the end of August that year.  Shortly after, Taliban soldiers began to emerge from caves and tunnels. As they swept through the country with breathtaking speed, their opponents, more than 300,000 strong and trained, equipped, and paid for by NATO, simply melted away.  To be sure, Americans did not think highly of the Afghan legions procured with their money, initially predicting they could not hold off the Taliban offensive for more than a year.  Yet, they were still caught completely off guard when the Afghan government collapsed in mid-August, well before the deadline of the planned withdrawal. If Americans had dreamed of a gracious if melancholy farewell from a country they thought they had liberated and rebuilt, the dream turned into a nightmare that will be remembered for generations to come.

Like most observers, I watched the events unfolding in Afghanistan that summer with shock, amusement, and confusion.  How could a poorly trained guerrilla force defeat a larger, better-equipped national army in just a few months? Why didn’t most Afghans fight harder to protect their political freedom, personal liberty, and women’s rights, the things that Americans insisted they should cherish the most? Even Biden seemed genuinely baffled by Afghans’ lack of will “to fight for their own future” even though Americans had given them “every tool they could need”.  These questions prompted me to seek answers in Afghan history.   The book I stumbled on was Games Without Rules by Tamim Ansary, an Afghan American author who was born in Kabul after WWII. Ansary covers the 250-year history of modern Afghanistan, starting from its legendary founder, Ahmad Shah Baba, and ending with the Islamic Republic in the 21st century.   An easy and enjoyable read, the book did not just answer most of my questions; it answered them head on, as if the author knew the questions would be asked ten years later.

First, a few things that surprised me.

I once thought that Afghans had always lived under a somewhat barbarous regime similar to the Taliban’s, and that it was Americans who incidentally liberated them from subjection to their antiquated institutions.   I was wrong.

The Taliban movement was in fact a new phenomenon that bore little resemblance to most Afghan regimes that came before it.  The reign of Abdur Rahman Khan (1880-1901)––also known as the Iron Amir––may be a close match in terms of brutality and religious rigidity, but he is also remembered by many as the king who united Afghanistan under one flag and set her on the path toward modernization.  Like many peoples that came in contact with the West in the past two centuries, Afghans had gone through, sometimes not on their own initiative or terms, multiple iterations of modernization projects.  Amanullah Khan (1919-1929), who fought for and won Afghan independence from the British Empire, was a radical reformer.  Among his daring edicts was a new law meant to replace Shari’a, which guaranteed many basic human rights, including freedom of religion and women’s rights – yes, a hundred years ago, Amanullah’s code already proclaimed that no girl should be denied the right to education and no woman should be required to wear the burqa.   However, Amanullah’s reform was way ahead of its time.  Afghans rebelled and kicked him out of the country; he ended up in Italy as a refugee, where he spent the rest of his life working as a carpenter.  After a few years of turmoil, the reign of Zahir Shah (1933-1973) charted a more moderate and successful trajectory, which culminated in the enactment of the 1964 constitution.   By introducing free elections, a parliament, civil and political rights, and universal suffrage––and effectively banning members of the royal family from holding high-level government offices––the constitution created a modern democratic state that is, in principle, similar to the Islamic Republic of the 2000s. By the early 1960s, Ansary wrote,

“in the big city of Kabul, women were beginning to appear in public showing not just their faces but their arms, their legs, even cleavage. Afghan girls of the elite technocratic class were beginning to cotton to Western fashions. They were wearing miniskirts and low-cut blouses. Nightclubs were popping up, which served beer and wine and whiskey—and not just to foreigners. Afghans were drinking and making no bones about it.”

So, how did Afghanistan descend from this lovely modern democracy to the Taliban’s Islamic Emirate? Well, it had much to do with geopolitics.

Contrary to my naïve preconception, Afghanistan has been enormously important to the struggles of great powers, especially those between Russia and the West. In the 19th century, the Russians attempted to reach the Indian Ocean from Central Asia. Determined to protect their enormous trade interests in the region from Russian interference, the British took Afghanistan as their protectorate by force.  If the objective was to stop the Russians, the British succeeded.  However, their control of the country was always fragile and treacherous.  According to Ansary, they had “won jurisdiction of every patch of Afghan territory their guns could cover—but not one inch more”. Eventually, after countless lives on both sides were lost to violence and a world war permanently weakened Europe, the British granted independence to Afghans. However, the domination of great power politics did not fade away. Instead, it morphed into a form that briefly became a benefaction, as Russians and Americans, each trying to recruit Afghans to their cause in the Cold War, offered extravagant aid packages.  In the 1950s and 1960s, the two superpowers “constructed over twelve hundred miles of superb paved roads through some of the planet’s most difficult terrain”, which connected “all of Afghanistan’s major cities”. Unfortunately, this relatively peaceful and prosperous era was interrupted by the rise of the communist movement in the late 1960s.   Social unrest ensued, followed by three coups d’état in the 1970s.  From the upheavals a deeply unpopular communist regime emerged in 1978, whose internal strife soon killed its pro-Soviet leader, Nur Mohammed Taraki, and forced his slayer and successor, Hafizullah Amin, to consider jumping ship to the Americans. The Soviet Union intervened, plunging into a 10-year war from which she would never recover.  Like the British in the 19th century, the Russians soon discovered that their war machine could easily crush the Afghan army and state but not the Afghan people.  Frustrated by the tenacious opposition led by the Mujahideen (Islamic jihadists), the Russians resorted to a scorched-earth policy aimed at depopulating rural Afghanistan. Their grotesque tactics did little to win the war but unleashed a humanitarian catastrophe of epic proportions.   According to Ansary, a million Afghans were killed and six million displaced in 1985 alone.

An entire generation of Afghan boys would grow up in the refugee camps and receive their education in religious madrassas (schools).  Having suffered through the worst childhood on earth, they were “allowed to imagine that it might be their destiny to establish the community that would save the world”.   From the schools of these refugee camps would rise the loyal followers of Mullah Omar, the founder of the “student movement”, or Taliban (literally, “students” in Arabic).   Under Omar’s leadership, the Taliban would win a bloody civil war in the 1990s, only to be dethroned a few years later in the wake of America’s anti-terrorist crusade.

The rest is history.

Let me get back to the questions that drew me to this book in the first place. Why didn’t the Afghan people fight harder for their freedom? The short answer is that there were two Afghan peoples: westernized urban elites and common folk from the countryside. The “Afghan people” often spoken of in the Western media may refer only to the former.  While the elites considered the Taliban an archenemy, the masses did not see the Taliban’s moral and religious imperatives as conflicting with their own.  While the elites were supposedly in charge, they never gained full control of the other Afghanistan.  Most importantly, when push came to shove, they had no idea how to “fight the fight and win the war”.

Why is Afghanistan so deeply divided?  As a collection of tribes and ethnic groups loosely coalesced around an Islamic culture across tough terrain, Afghanistan is an inherently weak state. This made it very hard for anyone, even the most powerful country in the world, to penetrate the layers of physical and cultural barriers that historically separate urban centers from rural communities. Without a strong state, most Afghans naturally turned to tribal and religious authorities for such basic state services as security, law enforcement, and education. Ansary likened ruling Afghanistan through a puppet government to swinging a pot by grasping its handle: the foreign powers thought they could swing the pot however they wanted; yet, because the handle was never firmly attached to the pot, they often ended up shattering the pot while holding nothing but a useless handle.

The innate weakness of the Afghan state was further compounded by the powerful legacy of Islam and the recurrent interventions by the West.  Unfortunately, Islam and the West have long been at odds with each other, and the animosity has only grown stronger in the past century.  As a result, the head of the Afghan state faces a constant dilemma.  On the one hand, because they need the support of the West––money, permission, or both––to secure power and to modernize the country, they must subscribe, or at least pay lip service, to Western values.   On the other hand, they cannot afford to alienate the masses who remain loyal to traditional values, lest they be thrown out of the palace like Amanullah.  The balancing act is so delicate that few could make it work, at least not for long.  Consequently, modernization in Afghanistan, because it is “foreign” in name and in essence, actually widened the cultural and wealth chasm between the elites who welcomed Western influences and the masses who continued to resist them.  Any attempt by a foreign power to correct course by direct intervention, regardless of methods or intentions, only serves to pour fuel on the fire.

Seen in this light, the Bush plan to rebuild Afghanistan after the 2001 invasion was doomed from the beginning.   On display in that 20-year nation-building project, largely funded by American taxpayers, is not so much America’s idealism as her arrogance and ignorance of history.  Biden was right to cut the losses as soon as he could.   In the end, Ansary tells us, Afghanistan would probably do okay, regardless of who was in charge, if only other countries were willing to leave her alone.   Let’s see if the world will heed his advice this time.

 

The Idea of History

I learned about R.G. Collingwood and his famous book from a Chinese podcaster who quoted Collingwood as saying, “All history is the history of thought” (in Chinese, 一切历史都是思想史). Struck by the profoundness of the quote, I decided to dig deeper.  Collingwood has been called the most underrated philosopher in history, a reputation largely earned by “The Idea of History”. The book was published posthumously, after his premature death in 1943 at the age of 53.

By “all history is the history of thought”, Collingwood means that history can only exist in the re-enactment of the past in a historian’s mind. Past events are over, have ceased to exist, and hence cannot be perceived and studied as real, actual objects. Thus, history is knowable only by thinking, and the proper object of history is thought itself: “not things thought about, but the act of thinking itself”.   It follows, I believe, that there is no such thing as the true past, or the real history.  History is idealistic in nature.  Seen in this light, the translation—“一切历史都是思想史” (history as a record of thoughts)—is misleading. The quote should rather read, “一切历史都是思考史” (history as the act of thinking).

Collingwood believes that a historian must go beyond the materials inherited from authorities.  Otherwise, he is a mere “scissors-and-paste” historian. Collingwood goes so far as to suggest that history, like the novel, is a work of imagination, and in this regard the two do not differ.  The historian must tell a coherent and believable story in which the actions of his characters are justified by circumstances, motives, and psychology.  I suppose Collingwood’s novelistic historian stands in sharp contrast to most Chinese historians, who actually praised and cherished the scissors-and-paste tradition, sticking to Confucius’s famous precept: 述而不作 (pass on the wisdom of the sages without adding anything new to it).

Collingwood argues the purpose of history is to inform the present, by revealing “what man has done and thus what man is”.  Reconstructing the past is always done to know the present and to tell us what to do in the present.  Moreover, the past and the present are the same object in different phases and therefore inseparable: we come to know the present naturally by studying the past, because the past is part of the present.

Collingwood believes all history is biased because everyone approaches history with their own biases. Indeed, if it were not for these biases, nobody would write history in the first place.   He does say a good historian must take no sides and “rejoices in nothing but the truth”, but how much of history is written by good historians?

Finally, Collingwood harshly criticized the “scientific” theories of universal history, i.e., the idea that the progress of human history is governed by some universal law.  Chinese students of my generation can attest that this is exactly what we learned in history classes.  According to Collingwood, the value of these theories “was exactly nil”, and, if they have been accepted by so many, it is only because they have “become the orthodoxy of a religious community”.  He claims only two types of people were still writing universal history in his time: the dishonest attempting to “spread their opinions by specious falsehoods”, and the ignorant naïvely writing down everything they know without “suspecting that they know it all wrong”.

To someone who grew up in China, where historical materialism is treated as the one and only truth, Collingwood’s ideas seem like heresy at first glance.  However, the more I read, the more I agree with him.  Since much of the book was compiled from lecture notes, reading it is close to taking a philosophy course: not exactly fun but worth the effort.

 

On Liberty

I had heard and read about John Stuart Mill many times before but had never read him. Known as the most influential English-speaking philosopher of the nineteenth century, he still has many followers and admirers in the new millennium, even in some intellectual circles in mainland China. For example, Xiang Luo (罗翔) – the famed Chinese law professor who has gained an incredibly strong following on the Internet because of his lucid and witty analysis of contemporary legal matters – is evidently a fan of Mill.   I was reluctant to read Mill, or for that matter any philosopher who lived two centuries ago, as I wasn’t sure I could understand, much less enjoy, their writings.  However, after reading a blog post by Luo that passionately praises On Liberty, I decided to at least give it a try.  I’m glad I did.

One of the most important works on political philosophy, On Liberty explains what constitutes liberty, why society must guarantee it, and how to resolve the conflict between liberty and order. Mill’s central argument is that a civilized community should not exercise power over its members against their will except for the purpose of preventing harm to others.  In his own words,

“The only freedom which deserves the name, is that of pursuing our own good in our own way, so long as we do not attempt to deprive others of theirs, or impede their efforts to obtain it”.

This doctrine, known as the harm principle, bestows on each person a virtual sphere, whose boundary may be described by the adage, “my right to swing my fist ends where your nose begins”.  The individual is sovereign over themselves within this sphere, which Mill divides into three compartments: (i) the liberty of conscience, including thought, feeling, opinion, and sentiment on all subjects; (ii) the liberty of planning one’s own life according to one’s tastes and character; and (iii) the liberty of uniting with other consenting individuals.

Per the harm principle, the US government seems to overstep its authority by outlawing prostitution, gambling, and drug use.   The government may consider these activities immoral and dangerous, even decidedly harmful to a person who engages in them, but still the person should only be warned of the danger, not forbidden from exposing themselves to it.  In fact, Mill thinks even commercializing such activities – say working as a pimp or selling drugs for a profit – may fall into the realm of individual liberty, so long as those activities themselves are admissible (under the harm principle, they surely are).

It should be noted that harm is a necessary but not a sufficient condition for interference.  Any competition for a scarce resource – admission to Ivy League colleges, election to political office, tickets to Taylor Swift’s concerts, to name a few – necessarily produces winners reaping benefits at the expense of losers. Do the winners thus harm the losers, materially and/or psychologically?  Mill asserts such a claim would be valid only if the winner has employed “fraud or treachery, and force”.   Nor must harm to others be caused by actions alone.  A person can be held accountable for harm attributed to their inaction, too, though compulsion against such offenses must be more carefully exercised.  A somewhat surprising example given by Mill is parents failing to provide their children with the “ordinary chance of a desirable existence”. That is, the failure at parenting is not just a family tragedy, but a crime against the children and society. In fact, Mill goes so far as to suggest that couples who cannot show they have the means of raising children properly should be denied the right to marry, effectively denying them the liberty to unite with others.

Mill would probably be called a free speech absolutist if he lived today. Expression of any opinion by any fringe group, in his mind, must be tolerated and protected, no questions asked.  To drive home this point, he writes,

“If all mankind minus one, were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind.”

Mill does not believe being offended by another person’s conduct or speech is an injury that warrants redress.  To him, the feeling of a person for their own opinion carries much more weight than the feeling of another who finds their holding it hurtful or offensive. If hate speech had been a thing back then, Mill would have been inclined to protect it too. He would be dumbfounded to learn that Larry Summers, the former president of Harvard University, was forced to resign simply because he offered a seemingly innocent explanation of women’s underrepresentation in science and engineering.  The only qualification to freedom of speech Mill would accept is that it must not incite violence.  For example, “an opinion that corn-dealers are starvers of the poor… may justly incur punishment when delivered orally to an excited mob assembled before the house of a corn-dealer”.   This example seems to fit the speech Donald Trump gave to the mob gathered in front of the White House on January 6, 2021: whether or not Trump intended to stop a proceeding of the US Congress by force, the mere presence of a mob that could heed his words means the speech violated the harm principle.

Mill had more than a healthy dose of skepticism about democracy.  He appears to suggest self-government is an illusion because there is no such thing as “the government of each by himself”, but only the government “of each by all the rest”. The will of the people so often spoken of, similarly, is the will of the majority, not the will of everyone.  Mill is wary of society hindering the development of individuality by compelling its members to adopt its own ideas and practices as the rule of conduct.  This tyranny of the majority, he contends, can be more oppressive than an actual tyrant, because “it leaves fewer means of escape, penetrating much more deeply into the details of life, and enslaving the soul itself.”

At times Mill sounds like a staunch elitist.  Deriding the masses as a “collective mediocrity”, he warns us of the danger of allowing the masses to take their opinions from “men much like themselves, addressing them or speaking in their name, on the spur of the moment, through the newspapers”.  Instead, to rise above mediocrity, the masses must be “guided by the counsels and influence of a more highly gifted and instructed One or Few”.  Exactly who these geniuses are, Mill does not specify.  I don’t think he meant elected officials, since no elected official in a democracy, including the president of the US, could ever hope to achieve this level of potency.

Liberty is not a natural right, according to Mill. He makes it clear that people who are incapable of “free and equal discussion” have no use for it. These “barbarians”, as Mill calls them, should consider themselves lucky if they can find a competent despot – “an Akbar or a Charlemagne” – to be their ruler.  Instead, Mill justifies liberty by its utility. Freedom of speech is indispensable because it guarantees “the opportunity of exchanging error for truth”.   Even if an opinion is wrong, we would gain, by giving it a fair hearing, a better understanding of truth “produced by its collision with error”.   As Mill puts it eloquently,

“he who knows only his own side of the case, knows little of that.”

Moreover, liberty fosters individuality, which is instrumental to human progress.  A civilization becomes stationary, Mill asserts, the moment it ceases to possess individuality. He argues the emphasis on conformity at the expense of individuality is the main reason why China fell so far behind the West at the time of his writing (twenty years after the first Opium War).   China enjoyed “a particularly good set of customs” from early on, thanks to the talent and wisdom of a few “sages and philosophers”.  Yet, her attempt to “impress the best wisdom upon every mind in the community” backfired because it ended up imposing the same maxims and rules on everyone’s thoughts and conduct, thereby eradicating individuality.  Remarkably, Mill’s analysis still rings true in today’s China.  Growing up in the 1970s and 1980s, I remember being taught that the best I could do for the nation was to become a “revolutionary screw” (革命的螺丝钉).   The word “revolutionary” might have been slowly phased out since then, but the metaphor has not. China still sees her citizens as standard parts in a well-oiled machine: indistinguishable and insignificant as individuals, but harmonious and powerful put together – or so she hopes.  In the past two centuries, China has tried to reinvent herself, insisting on doing it her own way, so many times that Albert Einstein might think her insane, as in “doing the same thing over and over and expecting different results”.  Will she succeed this time around?  I don’t know, but I will leave you with one of my favorite quotes from On Liberty (the emphasis is mine):

“A State which dwarfs its men, in order that they may be more docile instruments in its hands even for beneficial purposes, will find that with small men no great thing can really be accomplished”.

World Order

World Order is about the philosophy of international relations.  Kissinger argues that any stable system of world order needs both legitimacy, which is a belief about what constitutes a just order, and power, which is what holds the order together to keep peace.  In this view, power and legitimacy are interdependent: power is unsustainable without legitimacy, and legitimacy cannot maintain order without power.   The key is how to strike the right balance. Using this theoretical framework, Kissinger analyzes how the power-legitimacy equilibrium played out in four systems of historic world order.

The bedrock of world order before 1945 was the so-called “Westphalian system”, named after the Treaties of Westphalia, which ended the Thirty Years’ War in 1648.   The war was largely fought to settle the legitimacy of the Church’s monopoly over individuals’ spiritual relationship with God, and yet its sheer destruction convinced Europeans never again to center world order on moral authority.  Instead, the focus shifted entirely to the allocation and balance of power under value-neutral rules, such as mutual respect for the sovereignty of states and noninterference in the domestic affairs of other states. It goes without saying that these rules only apply to states wielding enough power to tilt the order off balance.

If the Westphalian system is all about power, the Islamic order is all about legitimacy.  Islam divides the world into the land of believers and the land of infidels.  Islamists consider themselves permanently and automatically at war with the world inhabited by unbelievers, and Jihad—the mission of expanding the Islamic faith through struggle—the only way to bring peace to all humanity.  They reject any other form of legitimacy because only Islam can offer the true form of freedom, the “freedom from governance by other men and man-made doctrines”. This feverish commitment to religious imperatives inevitably denies the reality of power dynamics, often with grave consequences. Kissinger notes how, for many Arab governments, it has, for example, “turned coexistence with Israel from an acceptance of reality” into an irreconcilable conflict with their own legitimacy.

Like Islamism, Confucianism refuses to recognize any sovereigns as legitimate unless they are subordinate to the Chinese emperor, who supposedly rules everything “beneath the sky” with the Mandate of Heaven.   There are two important differences, however. First, the Mandate of Heaven is not sanctioned by God, but hinges on the ruler’s willingness and ability to provide a good material life for the ruled. Second, China seeks respect, not conversion by force.  Instead, the “barbarians” are given a rung on her tributary ladder, according to their proximity to Chinese culture. Therefore, as Kissinger observed, there is no need “to order a world it considered already ordered, or best ordered by the cultivation of morality internally”.    To a certain extent, the current regime in China still sees the world the same way: it claims legitimacy from an ever-increasing standard of living for its people, and it seeks to dominate not necessarily by physical force but by its achievements and conduct.  On paper, China has adopted the Westphalian system since 1949, as evidenced by her commitment to the Five Principles of Peaceful Coexistence.   That, however, is a practical accommodation to reality, not a reflection of the Chinese ideal.  Chairman Xi’s vision of the China Dream, vague as it may sound to a foreigner, precisely expresses a national nostalgia for that glorious past, real and imagined, in which Chinese can pretend the world orbits around them for eternity.  That said, I think the threat of that vision to world peace has always been exaggerated in the West.  The image of an expansionist and missionary China is largely a mirage created from—depending on your propensity for cynicism—either a misunderstanding of or a disagreement with her preferred form of world order.

It would surprise no one that Kissinger thinks Americanism is our best shot at creating an optimal world order, though he makes it clear there is still room for improvement.  As the cliché goes, America started with an idea.   That idea, I think, is as much about liberty and democracy as about the American vision of world order.  Americans like to think they always place “principles” before “selfish interests” when it comes to world affairs. They are not only exceptional in this regard, but also destined to bring the vision to humanity.  As Thomas Jefferson put it, “it is impossible not to be sensible that we are acting for all mankind”.   Until Woodrow Wilson, however, America refrained from imposing her order on others. Instead, she contented herself with an exemplary role, as “the shining city on a hill”.  Ronald Reagan loved to talk about the shining city, and his depiction of it is simply too good to pass over:

“…in my mind, it was a tall proud city built on rocks stronger than oceans, wind swept, God blessed, and teeming with people of all kinds living in harmony and peace—a city with free ports that hummed with commerce and creativity, and if there had to be city walls, the walls had doors, and the doors were open to anyone with the will and the heart to get here. That’s how I saw it, and see it still.”

To the extent this metaphor advocates leading by example rather than conquest, it bears a resemblance to how China sees her role in the world.  It was under Wilson’s watch that America embarked on the mission to remake the world in her own image.   To Wilson, democracy was the source of legitimacy because it is both the best form of governance and the sole guarantee of permanent peace.  Thus, only by spreading democracy far and wide can humanity hope to resolve conflicts, achieve the equality of all nations, and maintain world peace and universal harmony.  This vision, ironically, is not that different from Islamism in terms of the end goal (world peace), the claim to an absolute moral truth, and the pledge to convert “unbelievers”.  To be sure, America does not openly threaten to wage war against unbelievers, opting instead for pressure tactics and sabotage campaigns.  Yet, she frequently found herself at war with them, not always supported by an airtight casus belli fully consistent with her “principles”.  Therefore, while in theory America dismisses any calculation of the Westphalian-style balance of power as immoral and dangerous, in practice she always reserves for herself the right to embrace such a calculation on an ad hoc basis. Kissinger apparently thinks this ambivalence is a feature, not a bug, of Americanism, as he writes,

“America’s moral aspirations need to be combined with an approach that takes into account the strategic element of policy in terms the American people can support and sustain through multiple political cycles.”

In other words, the art of practicing Americanism is to find that delicate balance between power and legitimacy, which is probably best illustrated in the famous (or infamous) American doctrine of strategic ambiguity on defending Taiwan.   The danger, however, is that Americanism can be seen as opportunistic, if not hypocritical.  The lack of transparency and consistency has enabled, and will continue to enable, America’s enemies to argue that she is, after all, no better than the value-neutral, power-centric imperialism she purports to displace, and that her professed love for human rights, democracy, and peace is but national interest in a fancy new dress.

If you are into geopolitics, you may find this book a real treat.  In essence it is a condensed world history, viewed through the lens of world order and filled with interesting details, anecdotes, and quotes that I truly enjoyed. Kissinger had turned 90 by the time the book was published in 2014, but he remained a cool-headed, clear-eyed, and elegant writer.   Perhaps more importantly, he was still a passionate believer in and defender of Americanism, who refused to say anything negative at all about any of the twelve postwar presidents of the United States.  This lack of self-reflection is somewhat disappointing but understandable, given that Kissinger was far from an impartial analyst of America’s world order.

Capital in the 21st Century

“Capital in the 21st Century” explains how the distribution of income and wealth (or capital) evolves according to the laws that govern economic growth, the rate of saving, and the returns on capital.  Piketty argues quite convincingly that the return on capital tends to outpace the growth of economic output (GDP).   Once set in motion, therefore, capitalism inevitably concentrates wealth, creating a self-reinforcing spiral that ends with an appallingly unequal distribution of wealth.  By Piketty’s estimation, the accumulation of wealth in the developed countries had by the 2010s returned to a level the world had not seen since the eve of World War I.  As a shocking symptom of this extreme inequality, the bottom 50% of the population collectively own close to nothing everywhere, including Sweden!  If this process is to continue indefinitely, he warns, “the past will devour the future” and we will return to a society of “rentiers dominant over those who own nothing but their labor”.

It is tempting to reject Piketty’s thesis out of hand as Marxism dressed in a new costume, and his laser focus on inequality as dangerous rhetoric tacitly inciting class warfare.  Such an interpretation would be unfair, however.  His main argument against capital concentration, I think, is not a moral one.  Rather, the concern is that concentrating so much capital in so few hands may be socially destabilizing. Moreover, as the stock of capital continues to grow relative to GDP, the return on capital may eventually converge to the rate of economic growth, at which point capitalists must reinvest all income from capital in order to merely preserve “their social status relative to the average for the society”. This last point, known as the golden rule, seems to me an ultimate manifestation of involution (内卷).
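To make the arithmetic behind these claims concrete, here is a minimal sketch in Piketty’s notation (r for the return on capital, g for the growth rate, s for the saving rate, β for the capital/income ratio); the numbers are my own illustrative picks, not his:

```latex
% Illustrative numbers, not Piketty's; the notation (r, g, s, beta) is his.
% Second fundamental law: the long-run capital/income ratio.
\[
  \beta = \frac{s}{g},
  \qquad \text{e.g. } s = 10\%,\ g = 1.5\%
  \;\Rightarrow\; \beta \approx 6.7 \text{ years of national income.}
\]
% Divergence when r > g: a fully reinvested fortune W grows at rate r
% while national income Y grows at rate g, so their ratio compounds:
\[
  \frac{W(t)}{Y(t)} = \frac{W(0)}{Y(0)}\, e^{(r-g)t},
  \qquad r = 5\%,\ g = 1.5\%
  \;\Rightarrow\; \text{doubling roughly every } \frac{\ln 2}{r-g} \approx 20 \text{ years.}
\]
```

The golden rule is the boundary case r → g: the ratio stops compounding, and merely holding one’s relative position requires plowing every unit of capital income back into capital.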

How do we avoid such an apocalypse then?  Piketty’s innovation is a tax levied directly on capital, including financial assets such as (unsold) stocks and bonds.   As utopian as it might sound, the idea has gained traction in mainstream politics lately.  The Democratic Party in the US, for example, recently proposed a “billionaire tax” to pay for Biden’s ambitious social spending programs.  Conceptually, the billionaire tax is exactly a tax on capital, though it limits the taxation base literally to “billionaires”. While this proposal died quickly, I suspect similar attempts will resurface in the future, if only as a novel revenue source for desperate governments.

Piketty is an articulate and persuasive writer, and “Capital” is absolutely worth reading.  As a side note, I found his open refusal to recognize economics as a science remarkable and laudable.  My jaw almost dropped when I read that “the discipline of economics has yet to get over its childish passion for mathematics”. With that kind of candidness, I am sure the book did not win Piketty many friends in his profession; but whether you agree with him or not, you must admit that telling it as it is in such dramatic fashion requires conviction, courage, and integrity.

From SPQR to one-man rule

Mary Beard’s SPQR—which stands for “the Senate and People of Rome”—covers the first thousand years of ancient Rome, running from the legendary founding of the city in 753 BCE to 212 CE, when Caracalla extended citizenship to all free men living within the Empire.  Her narrative is anchored in the time of Caesar and Cicero (i.e., the middle of the first century BCE), which not only saw Rome’s transition from a republic to one-man rule, but also produced a significant body of literature, including a huge volume of Cicero’s writing. Coincidentally, this period largely overlaps with the glorious thousand years of ancient China, from the Spring and Autumn Period, which officially commenced in 770 BCE, to the end of the Eastern Han Dynasty (220 CE).

You would be thoroughly disappointed if you were looking forward to the colorful stories of the famed Roman tyrants or the virtuous deeds of the five good emperors.  Beard refuses to reconstruct Roman history in terms of the biographies of the rulers. She is skeptical of the accuracy of their “standard images” passed on to us in historical records. More importantly, she does not believe “the qualities of the man on the throne” would make much difference, because all emperors, from Nero to Marcus Aurelius, ruled according to the same blueprint laid out by Augustus.  Her sentiment reminds me of an Afghan proverb I recently came across,

“Better a strong dog in the yard than a strong king in the capital”.  

Accordingly, Beard’s portrait of Rome focuses on ordinary Romans.  She depicts in vivid detail the Roman way of life, from where Romans lived and what they ate to how they commemorated their dead; she describes every facet of the society, from politics, entertainment, and personal finance to law enforcement and war.  Beard’s stories are always carefully backed up not only by the writings of contemporary Romans, but also by rich archeological records – many of which I never knew existed.    I very much appreciate her deemphasizing royal résumés and court intrigues. However, I’m not sure all emperors were as useless or harmless as she insists. It may be true that emperors had limited influence on the daily life of any ordinary peasant or aristocrat.   However, overly ambitious despots or utterly incompetent idiots could still, without great labor, throw their empires into cataclysm and destroy millions of lives in their wake. This is especially true of many Chinese dynasties, where an ever-present, sophisticated, and layered bureaucratic system could impose laws and extract resources in nearly every corner of the empire.

Beard seems to agree with Polybius—a Greek who wrote a 40-volume book entitled “Histories” in the second century BCE—that Rome’s rapid ascent to hegemony should be credited to the idea of checks and balances embedded in her political system.  The idea—maintaining a delicate equilibrium among the consuls, the senate, and the people—influenced the United States’ constitution so much that it remains the emblem of her politics to this day.  However, I suspect Polybius made a common mistake in social science here: extrapolating incomplete patterns into a specious theory. On the other side of the Earth, the Kingdom of Qin established the first Chinese empire in 221 BCE, 75 years before Rome became the master of the Mediterranean on the ruins of Carthage.  Qin was a highly centralized monarchy founded on legalism, a political philosophy antithetical to the idea of checks and balances. Legalists argue the more power is concentrated in the hands of the sovereign, the better. They advised the emperor that the people are not to be entrusted with liberty or the right to participatory governance; instead, they must be ruthlessly exploited for the collective national interest––whatever that means––and to save themselves from falling victim to their own vices (hence the slogan “serving the people”). Cruel as it might sound, legalism enabled Qin to conquer a vast territory by force and remake China in its own image. To be sure, the mighty Qin dynasty lasted only 15 years.  However, the polity it pioneered survived for millennia – some may argue it continues to this day. Thus, checks and balances were probably not the secret behind Rome’s unparalleled success.  Nor did they save Rome from the populist strongmen of the first century BCE – the likes of Pompey, Caesar, and Octavian.

Beard notes that “Roman emperors and their advisors never solved the problem of succession”.  Rather than sticking to primogeniture, Roman rulers often resorted to—sometimes forced by biology, as in the case of the Julio-Claudian dynasty—a form of ambiguous meritocracy for choosing their heirs. As a result, in the period covered by the book, only three emperors, Vespasian, Marcus Aurelius, and Septimius Severus, passed the throne to their biological sons.  Those who have watched the Hollywood movie “Gladiator” may remember the scene where Marcus Aurelius is murdered by his son, Commodus, who finds out the philosopher emperor is about to name an able and wise general as the heir to the throne. I have not seen much evidence supporting this dramatized version of the fateful succession that upended the era of the “Five Good Emperors”.  In fact, Commodus was named co-emperor––another strange Roman invention––at the age of 15 by his father.   Nevertheless, the Hollywood story captures the Romans’ ideal succession principle, perhaps best expressed in a speech delivered to the emperor Trajan by Pliny the Younger,

“If he is to rule over all, he must be chosen from all”.

To Pliny’s contemporaries in China––the elites of the Eastern Han Dynasty––the suggestion that an emperor should be chosen from all must have sounded absurd, if not blasphemous.  Legend has it that once upon a time the Chinese, too, chose their rulers by merit rather than birth, but that nostalgic era of Yao-Shun-Yu (尧舜禹) was long gone by the time Pliny wrote his speech.   The point, of course, was never about which succession principle is better, but rather that no principle always works under one-man rule. As Beard points out, transferring absolute power is an inherently unstable and dangerous business, and the moment when that power was supposedly handed on was “always the moment when the empire was most vulnerable.”  To this truth millions of people can still attest even today.

 

The End of Everything

I am always attracted to books titled “The End of X”: “The End of Faith”, “The End of Time”, “The End of Physics”, “The End of History”, and the list goes on.   Thus, when a colleague of mine named The End of Everything his favorite book about astrophysics, I knew I had to read it.   I was not disappointed.

The book is a layman’s guide to cosmology, with a focus on the death of the universe. Katie Mack explains that our universe could end in five different ways, and she expects humanity to survive in none of these scenarios.  Of the five endings, Heat Death seems the most humane to me.  In it, the universe continues to expand until it reaches a thermodynamic equilibrium, at which point nothing, including life in any form as we know it, can ever happen again.  The other four endings, if I understand them correctly, all involve a cataclysm that, according to Mack, you would never want to live long enough to witness.

A book entitled “The End of Everything”, of course, is inherently about eschatology.  Contemplating the end of the universe was surprisingly hard and strangely personal. In fact, I found it even harder than thinking of my own death. We humans often come to terms with death through the legacies we might leave behind: passing our genes on to the next generation; making the world a better place; or, better yet, enshrining our ideas in eternal knowledge.  However, if humanity itself will not survive the destruction of the universe, these justifications sound unconvincing.  “At some point, in a cosmic sense, it will not have mattered that we ever lived,” Mack tells us.  This comment reminds me of the famous quote from the movie Coco, “When there is no one left in the living world who remembers you, you disappear from this world. We call it the Final Death”. The end of the universe is the Final Death of humanity.

Mack then asks the obvious question: “What does that mean for humanity and where does that leave us now?”  In the epilogue, she tries but struggles to offer a satisfactory answer.  I could not come up with an answer either.  In fact, just thinking about it makes me feel sad. Indeed, when a colleague of Mack’s posed that question at an academic seminar, some people in the audience cried.

Mack is a great writer and communicator.  Her infectious passion for science and sharp wit make reading the book a joy that I looked forward to every day. For the first time, I feel that I actually understand what dark energy or cosmic background radiation is.  Of course, I still have no idea about the Higgs Field or Vacuum Decay, but that’s probably on me.

A Failure of Capitalism

Richard Posner is said to be the most cited legal scholar of the twentieth century.  A Failure of Capitalism, a book published toward the end of his distinguished career, is not among the most cited of his scholarly works – not even close – but probably the most read, judging by the number of ratings on Goodreads.  The “failure” of capitalism concerned herein is the financial crisis of 2008, which triggered the Great Recession, the worst recession the world had seen since the legendary depression of 1929.  The book attempts to explain the causes of the crisis, who should bear the blame, and what lessons we may collectively draw from the event.

Posner believes the main cause of the Great Recession was the confluence of low interest rates in the 2000s and the over-deregulation of the financial industry that began much earlier.  On the one hand, cheap credit encouraged the expansion of homeownership, which pushed up home prices because supply in the real estate market is usually slow to catch up with demand.  Rising prices convinced people that houses were a good investment, inducing more to dive in with money borrowed beyond their means to repay unless home prices continued to rise.  On the other hand, deregulation made it harder for traditional commercial banks to fund themselves cheaply through demand deposit accounts, owing to competition from investment banks and hedge funds alike. To stay in the game, therefore, banks had to rely more on borrowed short-term credit, increase their leverage (the ratio of debt to equity), and make longer-term (hence riskier) loans. With the huge demand for credit fueled by the low interest rates of the 2000s, this business model was pushed to the limit, exposing the entire industry to the risk of defaults in the housing market should prices begin to fall.  To mitigate these risks, banks invented complex debt-securitization devices, including the infamous credit-default swaps.   In hindsight, however, these tools were not so much about reducing the risks as hiding them, unconsciously or otherwise.
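To see why thin equity plus short-term borrowing was so fragile, a bit of back-of-the-envelope arithmetic helps (the numbers are my own illustration, not Posner’s):

```latex
% Illustrative arithmetic, not Posner's numbers.
% Leverage = debt / equity. Compare two banks, each holding $100 of assets:
\[
  \text{Bank A: } \$4 \text{ equity},\ \$96 \text{ debt}
  \;\Rightarrow\; \text{leverage } 96/4 = 24\!:\!1
\]
\[
  \text{Bank B: } \$10 \text{ equity},\ \$90 \text{ debt}
  \;\Rightarrow\; \text{leverage } 90/10 = 9\!:\!1
\]
% A 4% fall in asset values costs each bank $4 of equity:
%   Bank A loses 4/4  = 100% of its equity: insolvent the moment its
%   short-term creditors refuse to roll over the debt;
%   Bank B loses 4/10 = 40% of its equity: bruised but solvent.
```

The same small decline in home prices that a lightly levered bank could absorb thus wipes out a highly levered one, which is why falling prices threatened the entire industry at once.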

Given this analysis, it is hardly surprising that Posner argues the leaders at the Federal Reserve and other economic agencies – Alan Greenspan and, to a lesser extent, Ben Bernanke, among others – are culpable for a misconceived monetary policy and a lack of foresight about the impending crisis.  He also points fingers at the US government for its deregulation of the financial industry – driven largely by market fundamentalism – and its failure to prepare a contingency plan that would have avoided the “bumbling improvisations” in the initial response.   Posner is also dismayed at the “failure of the economics profession to have grasped the dangers”.  Many, including Robert Lucas – the most distinguished macroeconomist at the time – seemed to have been completely blindsided by the disaster.  Lucas had gone so far as to downplay the imminence of a recession as late as September 19, 2008, four days after the collapse of Lehman Brothers.  However, Posner argues his fellow academics deserve leniency for missing the warning signs that they were supposedly best poised to spot.  For one thing, they were not well equipped to “empirically test rival theories of depression” and were increasingly isolated in their own silos by ever-greater specialization. More importantly, doomsaying is a tricky and unpopular craft. As Posner points out, “Cassandras rarely receive a fair hearing”, because “it is very difficult to receive praise, and indeed to avoid criticism, for preventing a bad thing from happening unless the probability of its happening is known”.

Posner pushes back forcefully against the claim that the crisis had much to do with the stupidity or greed of bank executives and hedge fund managers.  Nor does he believe they should be held responsible for ignoring the warning signs of a gigantic bubble while taking seemingly undue risks to “ride it”.     “Riding a bubble can be rational”, Posner explains, especially when money is cheap.  More importantly, nobody really knows when a bubble, unsustainably large as it might seem, will burst, and until it does one could still make much more money riding it than climbing off.   Indeed, being rational could be a losing proposition when the majority is irrational, as summarized in Keynes’ famous aphorism,

“the market can stay irrational longer than you can stay solvent”.

Posner believes the duty of mitigating systemic risks resides elsewhere (i.e., with the government), because

“it would make no more sense for an individual businessman to worry that because of the instability of the banking industry his decisions and those of his competitors might trigger a depression than for a lion to spare a zebra out of concern that lions are eating zebras faster than the zebras can reproduce.”

Posner writes beautifully, with a combination of clarity, precision, and elegance that few authors could match.  If one wants to learn how to explain complex concepts to a layman in an accessible but still sophisticated manner, the book will make a great tutorial. I don’t know enough about macroeconomics or finance to comment on many an opinion expressed in the book.   Truth be told, the book taught me a lot about those subjects – the difference between equity and security being a memorable example. However, I do question the wisdom of writing a book about an event that had not even run its course at the time of writing (early 2009).   If Posner had waited a few more years, perhaps he would not have insisted on labeling the crisis a “depression”.  He might also have reconsidered his derision of the Fed’s low-interest policy, because that policy, in a much more aggressive form, not only survived the Great Recession but also thrived for more than a decade since.

Four Thousand Weeks

Oliver Burkeman’s Four Thousand Weeks deals with a very old question: how best to spend our limited time, roughly four thousand weeks in a lifetime (hence the book’s title)? In case you are wondering, this is not “yet another” book about time management. Burkeman will tell you that he hates time management coaches, and in fact his thesis is exactly the opposite: that you should literally stop trying to make the best of your time – which reminds me of the infamous Chinese internet meme: lying flat (躺平).  While he thus approaches the question from a deeply philosophical perspective, Burkeman wrote the book in a highly accessible – one may even say a little casual – way.   The book contains many fascinating ideas and ingenious insights about our relationship with time: some of which I have pondered myself; some of which I have vaguely felt but never set my mind on; some of which are completely new to me.

The first insight of the book concerns our endless quest for greater efficiency.  Productivity, according to Burkeman, is a trap, because eventually you will become the victim of your own efficiency.  The faster you respond to emails, the more emails are drawn into your mailbox; the sooner you submit your journal reviews, the earlier the editor is ready to send you the next invitation; the better you become at work, the busier you might feel.  As Burkeman quips, “your boss isn’t stupid, why would she give the extra work to someone slower?”  Seen in this light, the attempt to get everything under control – or keep your desk clean, to use a familiar metaphor – by using our limited time more productively is doomed to fail.

Burkeman argues our self-defeating obsession with efficient time management comes from anxiety about our own finitude.  It is almost certain that, before you die, you will see only a small part of the world, acquire a tiny portion of the knowledge our species has accumulated, and get far less done than you or your parents once dreamed you might.   Because our time is so limited, tough choices are inevitable. Yet, most people refuse to face this inevitability, and instead invent strategies that help them look the other way.  They convince themselves that the real culprit is suboptimal time management, that they can always accomplish more – even all that they ever wanted – if only they push themselves harder and find the perfect work-life balance. This, of course, is an illusion, because no optimization can possibly enable you to make time for everything you legitimately like or want.   Thus, the modern lifestyle of super-busyness is often an excuse, or a delay tactic, that numbs people emotionally so that they don’t have to feel their powerlessness over their own time, or say no to things they hate to give up.  As Nietzsche once said,

“haste is universal because everyone is in flight from himself!”

How do we confront our finitude then? Burkeman offers three suggestions.

First, be patient.  To be patient first means to accept that our lives will always be full of problems, many of which are unpredictable and might visit at what appears to be the “worst time”.  The truth is, the day on which your life finally becomes problem-free will never come, because life is nothing but a series of problem-solving episodes.   A life without problems is not worth living, the same way a novel without a plot is not worth reading.  To be patient also means, in Burkeman’s words, “to embrace radical incrementalism”.  The idea is that you should divide your problems into pieces and conquer them piece by piece, with the understanding that each piece may bring you only a relatively small step closer to the point of completion.  More often than not, seeking to finish your enemy off in a “decisive battle” – a concept cherished by the Imperial Japanese Navy – is not a winning strategy, but rather an indicator of impatience, anxiety, and weakness.

Second, be humble.  To be humble means to have a realistic expectation of the likelihood that you can – to quote Steve Jobs – “put a dent in the universe”.  The common wisdom suggests you have nothing to lose by setting overly ambitious goals, provided you are committed to following through.  The spirit is best summarized in the words of Theodore Roosevelt, “keep your eyes on the stars, and your feet on the ground.”  However, Burkeman thinks this mentality of aiming-high-to-miss-is-better-than-aiming-low-to-hit tends to make you overvalue your existence, giving rise to an undue sense of urgency to spend your finite time well.   If the goal is to change the world, your life should not only “transcend the common and the mundane” but also have a lasting impact on humanity.  But how many of us could ever make an impact of that proportion?  Even Jobs, Burkeman argued (and I agree), would fail to pass that mark in the grand scheme of things.  Indeed, if one takes a cosmological perspective, even humanity as a whole may fail to put a dent in the universe.  As Katie Mack explained in her book “The End of Everything”, any marks left by humanity will be irreversibly erased by the Final Death of the universe, and “at some point, in a cosmic sense, it will not have mattered that we ever lived”.   Younger people may dismiss Burkeman’s thesis as pessimism and defeatism, perhaps an indication that the prime of the man’s life is behind him.  However, as a middle-aged man in my forties, I think Burkeman was merely suggesting a change of perspective: from your own vantage point – from which a harmful self-importance seems natural and understandable – to the perspective of others, to whom what you are doing with your life matters little, if at all.

Third, be time, not use it.  That we don’t “have”, but “are”, a limited amount of time is a concept I had never conceived of, and perhaps never would have on my own.  This was the idea of Martin Heidegger, the German philosopher whose reputation was tarnished by his close association with the Third Reich.  “To be, for a human”, Burkeman writes of Heidegger’s insight of “being-toward-death”, “is above all to exist temporally, in the stretch between birth and death, certain that the end will come, yet unable to know when.”  This understanding of our relationship with time is radically different from the conventional wisdom, which insists the present is something we must instrumentalize for a great future gain, whose promise is as boundless as it is ambiguous – in fact so ambiguous that one often has difficulty articulating it when asked.    The laser focus on the future means, Burkeman observes, you end up living in it mentally, “locating the ‘real’ value of your life at some time that you haven’t yet reached, and never will.”    However, life is a succession of present moments and, since you may never know which one is your last, the moment of truth is always now.   Therefore, we must not view the present as a dress rehearsal for something greater to come, because the present moment is part of you – not merely a resource to be exploited by you.

Curiously, a trace of Heidegger’s idea of “being-toward-death” can be found in Japanese culture. In The Rising Sun, John Toland writes (the emphasis is mine),

“This strong recognition of death gave the Japanese not only the strength to face disaster stoically but an intense appreciation of each moment, which could be the last. This was not pessimism but a calm determination to let nothing discourage or disappoint or elate, to accept the inevitable.”

The word “sayonara”, commonly translated as goodbye, literally means “so be it” in Japanese. To the Japanese, “life was sayonara”, and they say sayonara to appreciate the present moment, as well as to accentuate its transience.  Where did the Japanese get this idea? I would guess Buddhism.  But the answer to this question is obviously beyond me, for now.

Marco van Basten

My interest in football was largely inspired by Marco van Basten’s legendary show at Euro 88. Strangely, I never watched a single game of that tournament—I only learned about his glorious triumph from a cousin, almost a year later—but that did not stop me from eagerly falling into his fandom, the only one I ever joined.  With the benefit of hindsight, this feels like a perfect example of the irrationality of human emotions, as epitomized in Stephen Chow’s famous inquiry, “Don’t I need a reason to fall in love with someone, do I, don’t I, do I…”

Unfortunately, my only hero was also a tragic one.  Like Achilles, van Basten had a foot problem.  During the years I fanatically followed him, van Basten lost the World Cup and the Euro Cup in a row; in the latter he missed a penalty kick that doomed his team.  In late 1992, he underwent surgery on his right ankle, which ended his career right at its peak; he was only 28 and had just been named FIFA player of the year.  I was deeply saddened by van Basten’s departure, so much so that my interest in football never fully recovered from that sense of loss, unfairness, and tragedy.

I cannot say van Basten’s memoir is an interesting read for everyone. The book is written in a somewhat informal style, perhaps designed to create an impression of authenticity and intimacy, but the style does at times hurt the coherence and clarity of the storytelling.   If you were a fan, however, you may enjoy many of the personal stories: the immense suffering he endured from the ankle, the intriguing tax fraud case, the fond recollections of Berlusconi (his boss at AC Milan and the notorious prime minister of Italy), and the mediocre coaching career.  To tell the truth, I wasn’t sure about the idea of reading the memoir of someone I had idolized for so long, perhaps because I wanted to evade the inevitable revelation that my tragic hero was a mirage after all. In the end, that was exactly what I discovered, but the experience was more fun than I expected: no regret or disappointment, just the closing of a chapter in life with a bit of nostalgia and relief.