Category Archives: English writing

On Liberty

I have heard and read about John Stuart Mill many times before but have never read him. Known as the most influential English-speaking philosopher of the nineteenth century, he still has many followers and admirers in the new millennium, even in some intellectual circles in mainland China. For example, Xiang Luo (罗翔) – the famed Chinese law professor who has gained an incredibly strong following on the Internet for his lucid and witty analysis of contemporary legal matters – is evidently a fan of Mill.  I was reluctant to read Mill, or for that matter any philosopher who lived two centuries ago, as I wasn't sure I could understand, much less enjoy, their writings.  However, after reading a blog post by Luo that passionately praises On Liberty, I decided to at least give it a try.  I'm glad I did.

One of the most important works on political philosophy, On Liberty explains what constitutes liberty, why society must guarantee it, and how to resolve the conflict between liberty and order. Mill’s central argument is that a civilized community should not exercise power over its members against their will except for the purpose of preventing harm to others.  In his own words,

“The only freedom which deserves the name, is that of pursuing our own good in our own way, so long as we do not attempt to deprive others of theirs, or impede their efforts to obtain it”.

This doctrine, known as the harm principle, grants each person a virtual sphere, whose boundary may be described by the adage, "my right to swing my fist ends where your nose begins".  The individual is sovereign over themselves within this sphere, which Mill divides into three compartments: (i) the liberty of conscience, including thought, feeling, opinion and sentiment on all subjects; (ii) the liberty of planning one's own life according to one's tastes and character; and (iii) the liberty of uniting with other consenting individuals.

Per the harm principle, the US government seems to overstep its authority by outlawing prostitution, gambling, and drug use.   The government may consider these activities immoral and dangerous, even decidedly harmful to a person who engages in them, but still the person should only be warned of the danger, not forbidden from exposing themselves to it.  In fact, Mill thinks even commercializing such activities – say working as a pimp or selling drugs for a profit – may fall into the realm of individual liberty, so long as those activities themselves are admissible (under the harm principle, they surely are).

It should be noted that harm is a necessary but not a sufficient condition for interference.  Any competition for a scarce resource – admission to Ivy League colleges, election to political office, tickets to Taylor Swift's concert, to name a few – necessarily produces winners reaping benefits at the expense of losers. Do the winners thus harm the losers, materially and/or psychologically?  Mill asserts such a claim would be valid only if the winner has employed "fraud or treachery, and force".   Nor must harm to others come from action alone.  A person can be held accountable for harm attributed to their inaction, too, though compulsion against such an offense must be more carefully exercised.  A somewhat surprising example given by Mill is parents failing to provide their children with the "ordinary chance of a desirable existence". That is, the failure at parenting is not just a family tragedy, but a crime against the children and society. In fact, Mill goes so far as to suggest that couples who cannot show they have the means of raising children properly should be denied the right to marry, effectively denying them the liberty to unite with others.

Mill would probably be called a free speech absolutist if he were alive today. Expression of any opinion by any fringe group, in his mind, must be tolerated and protected, no questions asked.  To drive home this point, he writes,

“If all mankind minus one, were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind.”

Mill does not believe being offended by another person's conduct or speech is an injury that warrants redress.  To him, the feeling of a person for their own opinion carries much more weight than the feeling of another who finds their holding it hurtful or offensive. If hate speech had been a thing back then, Mill would be inclined to protect it too. He would be dumbfounded upon learning that Larry Summers, the former president of Harvard University, was forced to resign simply because he offered a seemingly innocent explanation of women's underrepresentation in science and engineering.  The only qualification to the freedom of speech Mill would agree to is that it must not incite violence.  For example, "an opinion that corn-dealers are starvers of the poor… may justly incur punishment when delivered orally to an excited mob assembled before the house of a corn-dealer".   This example seems to fit the speech that Donald Trump gave to the mob gathered in front of the White House on January 6th, 2021 – whether or not Trump intended to stop a proceeding of the US Congress by force, the mere presence of a mob that could heed his words means the speech violated the harm principle.

Mill had more than a healthy dose of skepticism about democracy.  He appears to suggest self-government is an illusion because there is no such thing as "the government of each by himself", but only the government "of each by all the rest". The will of the people, similarly, is the will of the majority, not the will of everyone.  Mill is wary of society hindering the development of individuality by compelling its members to adopt its own ideas and practices as the rule of conduct.  This tyranny of the majority, he contends, can be more oppressive than an actual tyrant, because "it leaves fewer means of escape, penetrating much more deeply into the details of life, and enslaving the soul itself."

At times Mill sounds like a staunch elitist.  Deriding the masses as "collective mediocrity", he warns us of the danger of allowing them to take their opinions from "men much like themselves, addressing them or speaking in their name, on the spur of the moment, through the newspapers".  Instead, to rise above mediocrity, the masses must be "guided by the counsels and influence of a more highly gifted and instructed One or Few".  Exactly who these geniuses are Mill did not specify.  I don't think he meant elected officials, since no elected official in a democracy, including the president of the US, could ever hope to achieve this level of potency.

Liberty is not a natural right, according to Mill. He made it clear that people who are incapable of "free and equal discussions" have no use for it. These "barbarians", as Mill calls them, should consider themselves lucky if they can find a competent despot – "an Akbar or a Charlemagne" – to be their ruler.  Instead, Mill justifies liberty by its utility. Freedom of speech is indispensable because it guarantees "the opportunity of exchanging error for truth".   Even if an opinion is wrong, we would gain, by giving it a fair hearing, a better understanding of truth "produced by its collision with error".   As Mill puts it eloquently,

“he who knows only his own side of the case, knows little of that.”

Moreover, liberty fosters individuality, which is instrumental to human progress.  A civilization becomes stationary, Mill asserts, the moment it ceases to possess individuality. He argues that the emphasis on conformity at the expense of individuality is the main reason why China had fallen so far behind the West at the time of his writing (twenty years after the first Opium War).   China enjoyed "a particularly good set of customs" from early on, thanks to the talent and wisdom of a few "sages and philosophers".  Yet, her attempt to "impress the best wisdom upon every mind in the community" backfired because it ended up imposing the same maxims and rules on everyone's thoughts and conduct, thereby eradicating individuality.  Remarkably, Mill's analysis still rings true in today's China.  Growing up in the 1970s and 1980s, I remember being taught that the best I could do for the nation was to become a "revolutionary screw" (革命的螺丝钉).   The word "revolutionary" might have been slowly phased out since then, but the metaphor has not. China still sees her citizens as standard parts of a well-oiled machine: indistinguishable and insignificant as individuals, but harmonious and powerful put together – or so she hopes.  In the past two centuries, China has tried to reinvent herself, but has insisted on doing it her own way so many times that Albert Einstein might think she was insane, as in "doing the same thing over and over and expecting different results".  Will she succeed this time around?  I don't know, but I will leave you with one of my favorite quotes from On Liberty (the emphasis is mine):

“A State which dwarfs its men, in order that they may be more docile instruments in its hands even for beneficial purposes, will find that with small men no great thing can really be accomplished”.

World Order

World Order is about the philosophy of international relations.  Kissinger argues that any stable system of world order needs both legitimacy, which is a belief about what constitutes a just order, and power, which is what holds the order together to keep peace.  In this view, power and legitimacy are interdependent: power is unsustainable without legitimacy, and legitimacy cannot maintain order without power.   The key is how to strike the right balance. Using this theoretical framework, Kissinger analyzes how the power-legitimacy equilibrium played out in four systems of historic world order.

The bedrock of world order before 1945 was the so-called "Westphalian system", named after the Treaties of Westphalia, which ended the Thirty Years War in 1648.   The war was largely fought to settle the legitimacy of the Church's monopoly over individuals' spiritual relationship with God, and yet its sheer destruction convinced Europeans never again to center world order on moral authority.  Instead, the focus shifted entirely to the allocation and balance of power under value-neutral rules, such as mutual respect for the sovereignty of states and noninterference in the domestic affairs of other states. It goes without saying that these rules only apply to states wielding enough power to tilt the order off balance.

If the Westphalian system is all about power, the Islamic order is all about legitimacy.  Islam divides the world into the land of believers and the land of infidels.  Islamists consider themselves permanently and automatically at war with the world inhabited by unbelievers, and Jihad—the mission of expanding the Islamic faith through struggle—the only way to bring peace to all humanity.  They reject any other form of legitimacy because only Islam can offer the true form of freedom, the "freedom from governance by other men and man-made doctrines". This feverish commitment to religious imperatives inevitably denies the reality of power dynamics, often with grave consequences. Kissinger notes how, for many Arab governments, it has "turned coexistence with Israel from an acceptance of reality" into an irreconcilable conflict with their own legitimacy.

Like Islamism, Confucianism refuses to recognize any sovereigns as legitimate unless they are subordinate to the Chinese emperor, who supposedly rules everything "beneath the sky" with the Mandate of Heaven.   There are two important differences, however. First, the Mandate of Heaven is not sanctioned by God, but hinges on the ruler's willingness and ability to provide a good material life to the ruled. Second, China seeks respect, not conversion by force.  Instead, the "barbarians" are given a rung on her tributary ladder, according to their proximity to Chinese culture. Therefore, as Kissinger observed, there is no need "to order a world it considered already ordered, or best ordered by the cultivation of morality internally".    To a certain extent, the current regime in China still sees the world the same way: it claims legitimacy from an ever-rising standard of living for its people, and it seeks to dominate not necessarily by physical force but by its achievements and conduct.  On paper, China has adopted the Westphalian system since 1949, as evidenced by her commitment to the five principles of peaceful co-existence.   That, however, is a practical accommodation to reality, not a reflection of the Chinese ideal.  Chairman Xi's vision of the China Dream, vague as it may sound to a foreigner, precisely expresses a national nostalgia for that glorious past, real and imagined, in which the Chinese can pretend the world orbits around them for eternity.  That said, I think the threat of that vision to world peace has always been exaggerated in the West.  The image of an expansionist and missionary China is largely a mirage created from—depending on your propensity for cynicism—either a misunderstanding of or a disagreement with her preferred form of world order.

It would surprise no one that Kissinger thinks Americanism is our best shot at creating an optimal world order, though he makes it clear there is still room for improvement.  As the cliché goes, America started with an idea.   That idea, I think, is as much about liberty and democracy as it is about the American vision of world order.  Americans like to think they always place "principles" before "selfish interests" when it comes to world affairs. They are not only exceptional in this regard, but also destined to bring the vision to humanity.  As Thomas Jefferson put it, "it is impossible not to be sensible that we are acting for all mankind".   Until Woodrow Wilson, however, America refrained from imposing her order on others. Instead, she contented herself with an exemplary role, as "the shining city on a hill".  Ronald Reagan loved to talk about the shining city, and his depiction of it is simply too good to pass over:

“…in my mind, it was a tall proud city built on rocks stronger than oceans, wind swept, God blessed, and teeming with people of all kinds living in harmony and peace—a city with free ports that hummed with commerce and creativity, and if there had to be city walls, the walls had doors, and the doors were open to anyone with the will and the heart to get here. That’s how I saw it, and see it still.”

To the extent this metaphor advocates leading by example rather than conquest, it bears a resemblance to how China sees her role in the world.  It was under Wilson's watch that America embarked on the mission to remake the world in her own image.   To Wilson, democracy was the source of legitimacy because it is both the best form of governance and the sole guarantee of permanent peace.  Thus, only by spreading democracy far and wide can humanity hope to resolve conflicts, achieve the equality of all nations, and maintain world peace and universal harmony.  This vision, ironically, is not that different from Islamism in terms of the end goal (world peace), the claim to an absolute moral truth, and the pledge to convert "unbelievers".  To be sure, America does not openly threaten to wage wars against unbelievers, opting instead for pressure tactics and sabotage campaigns.  Yet she frequently finds herself at war with them, not always supported by an airtight casus belli fully consistent with her "principles".  Therefore, while in theory America dismisses any calculation of the Westphalian-style balance of power as immoral and dangerous, in practice she always reserves for herself the right to embrace such a calculation on an ad hoc basis. Kissinger apparently thinks this ambivalence is a feature, not a bug, of Americanism, as he writes,

“America’s moral aspirations need to be combined with an approach that takes into account the strategic element of policy in terms the American people can support and sustain through multiple political cycles.”

In other words, the art of practicing Americanism is to find that delicate balance between power and legitimacy, which is probably best illustrated by the famous (or infamous) American doctrine of strategic ambiguity on defending Taiwan.   The danger, however, is that Americanism can be seen as opportunistic, if not hypocritical.  The lack of transparency and consistency has enabled, and will continue to enable, America's enemies to argue that she is, after all, no better than the value-neutral, power-centric imperialism she purports to displace, and that her professed love for human rights, democracy and peace is but national interest in a fancy new dress.

If you are into geopolitics, you may find this book a real treat.  In essence it is a condensed world history, viewed through the lens of world order and filled with interesting details, anecdotes, and quotes that I truly enjoyed. Kissinger had turned 90 by the time the book was published in 2014, but he remained a cool-headed, clear-eyed, and elegant writer.   Perhaps more importantly, he was still a passionate believer in and defender of Americanism, who refused to say anything negative at all about any of the twelve postwar presidents of the United States.  This lack of self-reflection is somewhat disappointing but understandable, given that Kissinger was far from an impartial analyst of America's world order.

2022 Mid-term

Nearly every pundit and journalist I heard yesterday was shocked by the no-show of the Red Tsunami they had confidently prophesied.  Today, many of them seemed to have regained confidence in their own political acumen by explaining the failure with a new theory:  obviously (with the benefit of hindsight), the omnipresence of Trump had caused the GOP to underperform.  This reminds me of how good humans are at inventing theories to explain things – theorizing really seems easy and natural for us.   The tragedy is that we often fool ourselves into believing the ability to explain must also give us the ability to predict. The greater tragedy is that many genuinely believe in these predictions and, even worse, are committed to bringing them about.

Capital in the 21st Century

"Capital in the 21st Century" explains how the distribution of income and wealth (or capital) evolves according to the laws that govern economic growth, the rate of saving, and the returns on capital.  Piketty argues quite convincingly that the reproduction of capital tends to outpace that of economic output (GDP).   Once set in motion, therefore, capitalism inevitably concentrates wealth, creating a self-reinforcing spiral that ends with an appallingly unequal distribution of wealth.  By Piketty's estimation, the accumulation of wealth in the developed countries had by the 2010s returned to a level the world has not seen since the eve of World War I.  As a shocking symptom of this extreme inequality, the bottom 50% of the population collectively own close to nothing everywhere, including Sweden!  If this process continues indefinitely, he warns, "the past will devour the future" and we will return to a society of "rentiers dominant over those who own nothing but their labor".
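To get an intuition for this self-reinforcing spiral, here is a minimal sketch (my own toy model, not a calculation from the book): if capital compounds at a return that exceeds the growth rate of output, the ratio of capital to income keeps climbing. The rates and starting values below are illustrative assumptions only.

```python
# Toy illustration of wealth outpacing output when the return on capital
# exceeds economic growth. All numbers are illustrative assumptions.

r = 0.05          # assumed annual return on capital, fully reinvested
g = 0.015         # assumed annual growth of output (GDP)

capital = 4.0     # capital worth 4 years of national income at the start
income = 1.0      # national income, normalized to 1

for year in range(1, 101):
    capital *= 1 + r      # capital income is plowed back into capital
    income *= 1 + g       # output grows more slowly
    if year % 25 == 0:
        print(f"year {year:3d}: capital/income ratio = {capital / income:.1f}")
```

Under these made-up numbers the capital-to-income ratio roughly doubles every couple of decades, which is the mechanical core of the warning that "the past will devour the future".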

It is tempting to reject Piketty's thesis out of hand as Marxism dressed in a new costume, and his laser focus on inequality as dangerous rhetoric tacitly inciting class warfare.  Such an interpretation would be unfair, however.  His main argument against capital concentration, I think, is not a moral one.  Rather, the concern is that concentrating so much capital in so few hands may be socially destabilizing. Moreover, as the stock of capital continues to grow relative to GDP, the returns on capital may eventually converge to the rate of economic growth, at which point capitalists must reinvest all income from capital merely to preserve "their social status relative to the average for the society". This last point, known as the golden rule, seems to me an ultimate manifestation of involution (内卷).

How do we avoid such an apocalypse then?  Piketty's innovation is a tax levied directly on capital, including all financial assets such as (unsold) stocks and bonds.   As utopian as it might sound, the idea has gained traction in mainstream politics lately.  The Democratic Party in the US, for example, recently proposed a "billionaire tax" to pay for Biden's ambitious social spending programs.  Conceptually, the billionaire tax is exactly a tax on capital, though it limits the taxation base literally to "billionaires". While this proposal died quickly, I suspect similar attempts will resurface in the future, if only as a novel revenue source for desperate governments.

Piketty is an articulate and persuasive writer, and "Capital" is absolutely worth reading.  As a side note, I found his open refusal to recognize economics as a science remarkable and laudable.  My jaw almost dropped when I read "the discipline of economics has yet to get over its childish passion for mathematics". With that kind of candidness, I am sure the book did not win Piketty many friends in his profession; but whether you agree with him or not, you must admit that to "tell it as it is" in such a dramatic fashion requires conviction, courage, and integrity.

From SPQR to one-man rule

Mary Beard's SPQR—which stands for "the Senate and People of Rome"—covers the first thousand years of Rome, running from the legendary founding of the city in 753 BCE to 212 CE, when Caracalla extended citizenship to all free men living within the Empire.  Her narrative is anchored in the time of Caesar and Cicero (i.e., the middle of the first century BCE), which not only saw Rome's transition from a republic to one-man rule, but also produced a significant body of literature, including a huge volume of Cicero's writing. Coincidentally, this period largely overlaps with the glorious thousand years of Ancient China, from the Spring and Autumn Period, which officially commenced in 770 BCE, to the end of the Eastern Han Dynasty (220 CE).

You would be thoroughly disappointed if you were looking forward to reading the colorful stories of the famed Roman tyrants or the virtuous deeds of the five good emperors.  Beard refuses to reconstruct Roman history in terms of the biographies of the rulers. She is skeptical of the accuracy of their "standard images" passed on to us in historical records. More importantly, she does not believe "the qualities of the man on the throne" would make much difference, because all emperors, from Nero to Marcus Aurelius, ruled according to the same blueprint laid out by Augustus.  Her sentiment reminds me of an Afghan proverb I recently came across,

“Better a strong dog in the yard than a strong king in the capital”.  

Accordingly, Beard's portrait of Rome focuses on ordinary Romans.  She depicts in vivid detail the Roman way of life, from where Romans lived and what they ate to how they commemorated the dead; she describes every facet of the society, from politics, entertainment and personal finance to law enforcement and war.  Beard's stories are always carefully backed up not only by the writings of contemporary Romans, but also by rich archaeological records – many of which I never knew existed.    I very much appreciate her de-emphasizing royal résumés and court intrigues. However, I'm not sure all emperors were as useless or harmless as she insists. It may be true that emperors had limited influence on the daily life of any ordinary peasant or aristocrat.   However, overly ambitious despots or utterly incompetent idiots could still, without great labor, throw their empires into cataclysm and destroy millions of lives in their wake. This is especially true for many Chinese dynasties, where an ever-present, sophisticated, and layered bureaucratic system could impose laws and extract resources in nearly every corner of the empire.

Beard seems to agree with Polybius—a Greek who, in the second century BCE, wrote a 40-volume work entitled "Histories"—that Rome's rapid ascent to hegemony should be credited to the idea of checks and balances embedded in her political system.  The idea, which sought to maintain a delicate equilibrium between the consuls, the senate, and the people, influenced the United States' constitution so much that it remains the emblem of her politics to this day.  However, I suspect Polybius made a common mistake in social science here: extrapolating incomplete patterns into a specious theory. On the other side of the Earth, the Kingdom of Qin established the first Chinese empire in 221 BCE, 75 years before Rome became the master of the Mediterranean on the ruins of Carthage.  Qin was a highly centralized monarchy founded on legalism, a political philosophy antithetical to the idea of checks and balances. Legalists argue that the more concentrated the power in the hands of the sovereign, the better. They advised the emperor that the people are not to be entrusted with liberty or the right to participatory governance; instead, they must be ruthlessly exploited for the collective national interest––whatever that means––and to save themselves from falling victim to their own vices (hence the slogan "serving the people"). Cruel as it might sound, legalism enabled Qin to conquer a vast territory by force and remake China in its own image. To be sure, the mighty Qin dynasty lasted only 15 years.  However, the polity it pioneered survived for millennia – some may argue it continues to this day. Thus, checks and balances is probably not the secret behind Rome's unparalleled success.  Nor did it save Rome from the populist strongmen of the first century BCE – the likes of Pompey, Caesar and Octavian.

Beard notes that "Roman emperors and their advisors never solved the problem of succession".  Rather than sticking to primogeniture, Roman rulers often resorted to—sometimes forced by biology, as in the case of the Julio-Claudian dynasty—a form of ambiguous meritocracy for choosing their heir. As a result, for the period covered in the book, only three emperors, Vespasian, Marcus Aurelius and Septimius Severus, passed the throne to their biological sons.  Those who have watched the Hollywood movie "Gladiator" may remember the scene where Marcus Aurelius was murdered by his son, Commodus, who found out the philosopher emperor was about to name an able and wise general as heir to the throne. I have not seen much evidence supporting this dramatized version of the fateful succession that upended the era of the "five good emperors".  In fact, Commodus was named co-emperor––another strange Roman invention––at the age of 15 by his father.   Nevertheless, the Hollywood story captures the Romans' ideal succession principle, perhaps best expressed in a speech delivered to the emperor Trajan by Pliny the Younger,

“If he is to rule over all, he must be chosen from all”.

To Pliny's contemporaries in China––the elites of the Eastern Han Dynasty––the suggestion that an emperor should be chosen from all must have sounded absurd, if not blasphemous.  While legend has it that once upon a time the Chinese, too, chose their ruler by merit rather than birth, that nostalgic era of Yao-Shun-Yu (尧舜禹) was long gone by the time Pliny wrote his speech.   The point, of course, was never about which succession principle is better, but rather that no principle always works under one-man rule. As Beard points out, transferring absolute power is an inherently unstable and dangerous business, and the moment when that power was supposedly handed on was "always the moment when the empire was most vulnerable."  To this truth millions of people can still attest even today.

 

The End of Everything

I am always attracted to "The End of XXX": "The End of Faith", "The End of Time", "The End of Physics", "The End of History", and the list goes on.   So when a colleague of mine named The End of Everything as his favorite book about astrophysics, I knew I had to read it.   I was not disappointed.

The book is a layman's guide to cosmology, with a focus on the death of the universe. Katie Mack explains that our universe could end in five different ways, and she expects humanity to survive in none of these scenarios.  Of the five endings, Heat Death seems the most humane to me.  In it, the universe will continue to expand until it reaches thermodynamic equilibrium, at which point nothing, including life in any form as we know it, can ever happen again.  The other four endings, if I understand them correctly, all involve a cataclysm that, according to Mack, you would never want to live long enough to witness.

A book entitled "The End of Everything", of course, is inherently about eschatology.  Contemplating the end of the universe was surprisingly hard and strangely personal. In fact, I found it even harder than thinking of my own death. We humans often come to terms with death through the legacies we might leave behind: passing our genes on to the next generation; making the world a better place; or better yet, enshrining our ideas in eternal knowledge.  However, if humanity itself will not survive the destruction of the universe, these justifications sound unconvincing.  "At some point, in a cosmic sense, it will not have mattered that we ever lived," Mack tells us.  This comment reminds me of the famous quote from the movie Coco, "When there is no one left in the living world who remembers you, you disappear from this world. We call it the Final Death". The end of the universe is the Final Death of humanity.

Mack then asks the obvious question: "What does that mean for humanity and where does that leave us now?"  In the epilogue, she tries, but struggles, to offer a satisfactory answer.  I could not come up with an answer either.  In fact, just thinking about it makes me feel sad. Indeed, when a colleague of Mack's posed that question at an academic seminar, some people in the audience cried.

Mack is a great writer and communicator.  Her infectious passion for science and sharp wit make reading the book a joy that I looked forward to every day.  For the first time, I feel that I actually understand what dark energy or cosmic background radiation is.  Of course, I still have no idea about the Higgs Field or Vacuum Decay, but that's probably on me.

A Failure of Capitalism

Richard Posner is said to be the most cited legal scholar of the twentieth century.  A Failure of Capitalism, a book published toward the end of his distinguished career, is not among the most cited of his scholarly works – not even close – but it is probably the most read, judging by the number of ratings on Goodreads.  The "failure" of capitalism concerned here is the financial crisis of 2008, which triggered the Great Recession, the worst recession the world had seen since the legendary depression of 1929.  The book attempts to explain the causes of that crisis, who should bear the blame, and what lessons we may collectively draw from the event.

Posner believes the main cause of the Great Recession was the confluence of low interest rates in the 2000s and the over-deregulation of the financial industry, which began much earlier.  On the one hand, cheap credit encouraged the expansion of homeownership, which pushed up home prices because supply in the real estate market is usually slow to catch up with demand.  Rising prices convinced people that houses were a good investment, thereby inducing more to dive in with money borrowed beyond their means to pay back unless home prices continued to rise.  On the other hand, deregulation made it harder for traditional commercial banks to raise capital from demand deposit accounts, owing to the competition from investment banks and hedge funds alike. To stay in the game, therefore, banks had to rely more on borrowed short-term credit, increase their leverage (the ratio of debt to equity) and make longer-term (hence riskier) loans. With the huge demand for credit fueled by the low interest rates of the 2000s, this business model was pushed to the limit, exposing the entire industry to the risk of default in the housing market should prices begin to fall.  To mitigate these risks, banks invented complex debt securitization devices, including the infamous credit-default swaps.   In hindsight, however, these tools were not so much about reducing the risks as hiding them, unconsciously or otherwise.
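To see why leverage left banks so exposed, here is a back-of-the-envelope sketch (my own toy arithmetic, not a figure from the book): with leverage defined, as above, as the ratio of debt to equity, the more leveraged the balance sheet, the smaller the fall in asset values needed to wipe out the equity entirely. The ratios below are illustrative assumptions.

```python
# Toy arithmetic: how far asset values must fall to erase a bank's equity,
# given leverage = debt / equity. The ratios are illustrative assumptions.

def wipeout_loss(leverage: float) -> float:
    """Fractional drop in asset value that erases all equity."""
    equity = 1.0
    debt = leverage * equity
    assets = equity + debt
    return equity / assets        # equity is gone once assets fall by this share

for lev in (5, 10, 30):
    print(f"leverage {lev}:1 -> a {wipeout_loss(lev):.1%} fall in asset values "
          f"wipes out the equity")
```

At 30:1 leverage, a fall of barely three percent in asset values is enough, which is why a downturn in one market could threaten an entire industry built on short-term borrowing.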

Given this analysis, it is hardly surprising that Posner argues the leaders at the Federal Reserve and other economic agencies – Alan Greenspan and, to a lesser extent, Ben Bernanke, among others – are culpable for a misconceived monetary policy and a lack of foresight about the impending crisis.  He also points a finger at the US government for its deregulation of the financial industry – driven largely by market fundamentalism – and its failure to prepare a contingency plan that would have avoided the "bumbling improvisations" of the initial response.   Posner is also dismayed at the "failure of the economics profession to have grasped the dangers".  Many, including Robert Lucas – the most distinguished macroeconomist at the time – seemed to have been completely blindsided by the disaster.    Lucas went so far as to downplay the imminence of a recession as late as September 19, 2008, four days after the collapse of Lehman Brothers.     However, Posner argues his fellow academics deserve leniency for missing the warning signs that they were supposedly best poised to spot.  For one thing, they were not well equipped to "empirically test rival theories of depression" and were increasingly isolated in their own silos by ever-greater specialization. More importantly, doomsaying is a tricky and unpopular craft. As Posner points out, "Cassandras rarely receive a fair hearing", because "it is very difficult to receive praise, and indeed to avoid criticism, for preventing a bad thing from happening unless the probability of its happening is known".

Posner pushes back forcefully against the claim that the crisis had much to do with the stupidity or greed of bank executives and hedge fund managers.  Nor does he believe they should be held responsible for not heeding the warning signs of a gigantic bubble while taking seemingly undue risks to "ride it".     "Riding a bubble can be rational", Posner explains, especially when money is cheap.  More importantly, nobody really knows when a bubble, unsustainably large as it might seem, will burst, and until it does one could still be making much more money riding it than climbing off.   Indeed, being rational can be a losing proposition when the majority is irrational, as summarized in the famous aphorism attributed to Keynes,

“the market can stay irrational longer than you can stay solvent”.

Posner believes the duty of mitigating systemic risks resides elsewhere (i.e., with the government), because

“it would make no more sense for an individual businessman to worry that because of the instability of the banking industry his decisions and those of his competitors might trigger a depression than for a lion to spare a zebra out of concern that lions are eating zebras faster than the zebras can reproduce.”

Posner writes beautifully, with a combination of clarity, precision, and elegance that few authors can match.  If one wants to learn how to explain complex concepts to a layman in an accessible yet sophisticated manner, the book makes a great tutorial. I don't know enough about macroeconomics or finance to comment on many an opinion expressed in the book.   Truth be told, the book taught me a lot about those subjects – the difference between equity and security being a memorable example. However, I do question the wisdom of writing a book about an event that had not even run its course at the time of writing (early 2009).   Had Posner waited a few more years, perhaps he would not have insisted on labeling the crisis a "depression".  He might also have reconsidered his derision of the Fed's low-interest policy, because that policy, in a much more aggressive form, not only survived the Great Recession but also thrived for more than a decade afterward.

Four Thousand Weeks

Oliver Burkeman's Four Thousand Weeks deals with a very old question: how best to spend our limited time, roughly four thousand weeks in a lifetime (hence the book's title)? In case you are wondering, this is not "yet another" book about time management. Burkeman will tell you that he hates time management coaches, and in fact his thesis is exactly the opposite: that you should literally stop trying to make the best of your time – which reminds me of the infamous Chinese internet meme: lying flat (躺平).  While he approaches the question from a deeply philosophical perspective, Burkeman wrote the book in a highly accessible – one may even say a little casual – way.   The book contains many fascinating ideas and ingenious insights about our relationship with time: some of which I have pondered myself; some of which I have vaguely felt but never settled my mind on; and some of which are completely new to me.

The first insight of the book concerns our endless quest for greater efficiency.  Productivity, according to Burkeman, is a trap, because eventually you become the victim of your own efficiency.  The faster you respond to emails, the more emails are drawn into your mailbox; the sooner you submit your journal reviews, the earlier the editor is ready to send you the next invitation; the better you become at work, the busier you might feel.  As Burkeman quipped, "your boss isn't stupid, why would she give the extra work to someone slower"?  Seen in this light, the attempt to get everything under control – or keep your desk clean, to use a familiar metaphor – by using our limited time more productively is doomed to fail.

Burkeman argues our self-defeating obsession with efficient time management comes from anxiety about our own finitude.  It is almost certain that, before you die, you will see only a small part of the world, acquire a tiny portion of the knowledge our species has accumulated, and get much less done than you or your parents once dreamed you might.   Because our time is so limited, tough choices are inevitable. Yet most people refuse to face this inevitability, and instead invent strategies that help them look the other way.  They convince themselves that the real culprit is suboptimal time management, that they could always accomplish more – even all that they have ever wanted – if only they pushed themselves harder and found the perfect work-life balance. This, of course, is an illusion, because no optimization can possibly enable you to make time for everything you legitimately like or want.   Thus, the modern lifestyle of super-busyness is often an excuse, or a delaying tactic, that numbs us emotionally so that we don't have to feel our powerlessness over our own time, or say no to things we hate to give up.  As Nietzsche once said,

“haste is universal because everyone is in flight from himself!”

How do we confront our finitude then? Burkeman offered three suggestions.

First, be patient.  To be patient first means to accept that our life will always be full of problems, many of which are unpredictable and might visit at what appears to be the "worst time".  The truth is, the day on which your life finally becomes problem-free will never come, because life is nothing but a series of problem-solving episodes.   A life without problems is not worth living, just as a novel without a plot is not worth reading.  To be patient also means, in Burkeman's words, "to embrace radical incrementalism".  The idea is that you should divide your problems into pieces and conquer them piece by piece, with the understanding that each piece may only bring you a relatively small step closer to the point of completion.  More often than not, seeking to finish your enemy off in a "decisive battle" – a concept cherished by the Imperial Japanese Navy – is not a winning strategy, but rather an indicator of impatience, anxiety and weakness.

Second, be humble.  To be humble means to have a realistic expectation of the likelihood that you can – to quote Steve Jobs – "put a dent in the universe".  Common wisdom suggests you have nothing to lose by setting overly ambitious goals, provided you are committed to following through.  The spirit is best summarized in the words of Theodore Roosevelt, "keep your eyes on the stars, and your feet on the ground."  However, Burkeman thinks this mentality of aiming-high-to-miss-is-better-than-aiming-low-to-hit tends to make you overvalue your existence, giving rise to an undue sense of urgency to spend your finite time well.   If the goal is to change the world, your life should not only "transcend the common and the mundane" but also have a lasting impact on humanity.  However, how many of us could ever make an impact of that proportion?  Even Jobs, Burkeman argued (and I agree), would fail to pass that mark in the grand scheme of things.  Indeed, if one takes a cosmological perspective, even humanity as a whole may fail to put a dent in the universe.  As Katie Mack explained in her book "The End of Everything", any marks left by humanity will be irreversibly erased by the Final Death of the universe, and "at some point, in a cosmic sense, it will not have mattered that we ever lived".   Younger people may dismiss Burkeman's thesis as pessimism and defeatism, perhaps an indication that the prime of the man's life is behind him.  However, as a middle-aged man in my forties, I think Burkeman was merely suggesting a change of perspective: from your own vantage point – from which harmful self-importance seems natural and understandable – to the perspective of others, to whom what you are doing with your life matters little, if at all.

Third, be time, not use it.  That we don't "have", but "are", a limited amount of time is a concept I had never conceived of, and perhaps never would have on my own.  This was the idea of Martin Heidegger, the German philosopher whose reputation was tarnished by his close association with the Third Reich.  "To be, for a human", Burkeman writes of Heidegger's insight of "being-toward-death", "is above all to exist temporally, in the stretch between birth and death, certain that the end will come, yet unable to know when."  This understanding of our relationship with time is radically different from the conventional wisdom, which insists the present is something we must instrumentalize for a great future gain, whose promise is as boundless as it is ambiguous – in fact so ambiguous that one often has difficulty articulating it when asked.    This laser focus on the future means, Burkeman observes, you end up living in it mentally, "locating the 'real' value of your life at some time that you haven't yet reached, and never will."    However, life is a succession of present moments and, since you may never know which one is your last, the moment of truth is always now.   Therefore, we must not view the present as a dress rehearsal for something greater to come, because the present moment is part of you – not merely a resource to be exploited by you.

Curiously, a trace of Heidegger's idea of "being-toward-death" can be found in Japanese culture. In The Rising Sun, John Toland writes (the emphasis is mine),

“This strong recognition of death gave the Japanese not only the strength to face disaster stoically but an intense appreciation of each moment, which could be the last. This was not pessimism but a calm determination to let nothing discourage or disappoint or elate, to accept the inevitable.”

The word "sayonara", commonly translated as goodbye, literally means "so be it" in Japanese. To the Japanese, "life was sayonara", and they say sayonara to appreciate the present moment, as well as to accentuate its transience.  Where did the Japanese get this idea? I would guess Buddhism.  But the answer to this question is obviously beyond me, for now.

Marco van Basten

My interest in football was largely inspired by Marco van Basten's legendary show at Euro 88. Strangely, I never watched a single game of that tournament—I only learned about his glorious triumph from a cousin, almost a year later—but that did not stop me from eagerly falling into his fandom, the only one I ever joined.  With the benefit of hindsight, this feels like a perfect example of the irrationality of human emotions, as epitomized in Stephen Chow's famous inquiry, "Do I need a reason to fall in love with someone? Do I? Don't I? Do I…"

Unfortunately, my only hero was also a tragic one.  Like Achilles, van Basten had a foot problem.  During the years I fanatically followed him, van Basten lost the World Cup and the Euro Cup in a row; in the latter he missed a penalty kick that doomed his team.  In late 1992, he underwent surgery on his right ankle, which ended his career right at its peak; he was only 28 and had just been named FIFA player of the year.  I was deeply saddened by van Basten's departure, so much so that my interest in football has never fully recovered from that sense of loss, unfairness, and tragedy.

I cannot say van Basten's memoir is an interesting read for everyone. The book is written in a somewhat informal style, perhaps designed to create an impression of authenticity and intimacy, but at times this hurts the coherence and clarity of the storytelling.   If you are a fan, however, you may enjoy many of the personal stories: the immense suffering he endured from the ankle, the intriguing tax fraud case, the fond recollections of Berlusconi (his boss at AC Milan and the notorious prime minister of Italy), and the mediocre coaching career.  To tell the truth, I had been unsure about the idea of reading the memoir of someone I have idolized for so long, perhaps to evade the inevitable revelation that my tragic hero was a mirage after all. In the end, that was exactly what I discovered, but the experience was more fun than I had expected: no regret or disappointment, just closing a chapter in life with a bit of nostalgia and relief.

The Sovereign Individual

I was intrigued by this book mainly because Peter Thiel said it "tremendously influenced" him.  I am no fan of Thiel, but he is a student of both philosophy and entrepreneurship, a rare creature among intellectuals.   Anyway, my reaction is not as positive as his.  To be sure, the book's main insight – that technology determines the returns to organized violence, which in turn shapes the structure of our political systems – is brilliant.  Its critiques of democracy are harsh, sometimes unfair, but not without merit.  Published in 1997, the book predicted, with amazing foresight, the rise of Bitcoin, the threat of cyber warfare, the destructive power of social media, and, to a lesser extent, the election of Trump.

Having said that, I am also troubled by the excessive right-wing rhetoric, the covert racism, and above all the obsession with Social Darwinism.  The authors predict that nation-states cannot survive the Information Revolution, just as the medieval Church did not survive the gunpowder revolution 500 years ago.  The reason, according to the book, is that information technology will undermine the ability of nation-states to collect taxes and to wage war in cyberspace.  From the ruins of nation-states shall Sovereign Individuals rise.  These superhumans, no longer pledging allegiance to any nation, will simply shop around in an open market of commercialized sovereignties.   Of course, the idea that everyone can and should buy protection services from the mafia bosses and warlords who sell them at the lowest price is absurd and dangerous.   It also defies logic and history to suggest protection against violence would become more cost-effective simply by unleashing competition. What is most horrifying, however, is the book's complete lack of concern for the "losers or left-behinds" who are incapable of becoming Sovereign Individuals.  The authors thought these losers would fight tooth and nail to save their "license to steal" (from Sovereign Individuals) but would not prevail.   Beyond that, their fate is unclear but ceases to be a concern.

In summary, reading this book is sort of like eating spicy crayfish (麻辣小龙虾).  If one is willing to peel off the nasty red shell, there are delicious treats waiting inside, although some may find the reward unworthy of the effort.  Stay away if you are allergic to spicy food.