Creeping Toward Dystopia

May 25, 2023 ROBERT SKIDELSKY

Amid the growing excitement about generative AI, there are also mounting concerns about its potential contribution to the erosion of civil liberties. The convergence of state intelligence agencies and surveillance capitalism underscores the threat that artificial intelligence poses to the future of democracy.

LONDON – With investors pouring billions of dollars into artificial intelligence-related startups, the generative AI frenzy is beginning to look like a speculative bubble akin to the Dutch tulip mania of the 1630s and the South Sea Bubble of the early eighteenth century. And, much like those episodes, the AI boom appears headed for an inevitable bust. Instead of creating new assets, it threatens to leave behind only mountains of debt. 

Today’s AI hype is fueled by the belief that large language models like OpenAI’s newly released GPT-4 will be able to produce content that is virtually indistinguishable from output produced by humans. Investors are betting that advanced generative AI systems will effortlessly create text, music, images, and videos in any conceivable style in response to simple user prompts. 

Amid the growing enthusiasm for generative AI, however, there are mounting concerns about its potential impact on the labor market. A recent report by Goldman Sachs on the “potentially large” economic effects of AI estimates that as many as 300 million jobs are at risk of being automated, including many skilled and white-collar jobs. 

To be sure, many of the promises and perils linked to AI’s rise are still on the horizon. We have not yet managed to develop machines that possess the level of self-awareness and capacity for informed decision-making that aligns with most people’s understanding of intelligence. This is why many technologists advocate incorporating “moral rules” into AI systems before they surpass human capabilities. 

But the real danger is not that generative AI will become autonomous, as many tech leaders would have us believe, but rather that it will be used to undermine human autonomy. Both “narrow” and “general purpose” AI systems that can perform tasks more efficiently than humans represent a remarkable opportunity for governments and corporations seeking to exert greater control over human behavior. 

As Shoshana Zuboff notes in her 2019 book The Age of Surveillance Capitalism, the evolution of digital technologies could lead to the emergence of “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales.” The increasingly symbiotic relationship between government and private-sector surveillance, she observes, is partly the result of a national-security apparatus “galvanized by the attacks of 9/11” and intent on nurturing and appropriating emerging technologies to gain “total knowledge” of people’s behavior and personal lives.

Palantir, the data-analytics company co-founded by billionaire investor Peter Thiel, is a case in point. Thiel, a prominent Republican donor, reportedly persuaded former US President Donald Trump’s administration to grant Palantir lucrative contracts to develop AI systems tailored for military use. Palantir, in turn, provides intelligence services to the US government and to spy agencies around the world.

In “A Voyage to Laputa,” the third part of Jonathan Swift’s Gulliver’s Travels, Captain Gulliver comes across a floating island inhabited by scientists and philosophers who have devised ingenious methods for detecting conspiracies. One of these methods involves scrutinizing the “diet of all suspected persons,” as well as closely examining “their excrements,” including “the color, the odor, the taste, the consistence, the crudeness or maturity of digestion.” While the modern state-surveillance apparatus focuses on probing emails rather than bodily functions, it has a similar objective: to uncover plots and conspiracies against “public order” or “national security” by penetrating the depths of people’s minds. 

But the extent to which governments can spy on their citizens depends not only on the available technologies but also on the checks and balances provided by the political system. That is why China, whose regulatory system is entirely focused on preserving political stability and upholding “socialist values,” was able to establish the world’s most pervasive system of electronic state surveillance. It also helps explain why China is eager to position itself as a world leader in regulating generative AI.

In contrast, the European Union’s approach to regulation is centered around fundamental human rights, such as the rights to personal dignity, privacy, freedom from discrimination, and freedom of expression. Its regulatory frameworks emphasize privacy, consumer protection, product safety, and content moderation. While the United States relies on competition to safeguard consumer interests, the EU’s AI Act, which is expected to be finalized later this year, explicitly prohibits the use of user-generated data for “social scoring.” 

The West’s “human-centered” approach to regulating AI, which emphasizes protecting individuals from harm, contrasts sharply with China’s authoritarian model. But there is a clear and present danger that the two will ultimately converge. This looming threat is driven by the inherent conflict between the West’s commitment to individual rights and its national-security imperatives, which tend to take precedence over civil liberties in times of heightened geopolitical tensions. The current version of the AI Act, for example, grants the European Commission the power to prohibit practices such as predictive policing, but with various exemptions for national-security, defense, and military uses. 

Amid the fierce competition for technological supremacy, governments’ ability to develop and deploy intrusive technologies poses a threat not just to companies and political regimes but to entire countries. This malign dynamic stands in stark contrast to optimistic predictions that AI will bring about a “wide array of economic and societal benefits across the entire spectrum of industries and social activities.”

Unfortunately, the gradual erosion of countervailing powers and constitutional limits on government action within Western liberal democracies plays into the hands of authoritarian regimes. As George Orwell presciently observed, a state of perpetual war, or even the illusion of it, creates an ideal setting for the emergence of a technological dystopia.

Can Governments Still Steer the Economy?

Mar 28, 2023 ROBERT SKIDELSKY

Inflation and growth rates are increasingly determined by global events over which national policymakers have no control. Instead of clinging to the illusion that they can control the uncontrollable, governments should use fiscal policy to protect their most vulnerable citizens from disruptive external shocks.

LONDON – In 1969, the British financial journalist Samuel Brittan published a book called Steering the Economy: The Role of the Treasury. At the time, it was still widely assumed that the United Kingdom’s economy was steerable and that the Treasury (which was still in charge of monetary policy) was at the helm.

Back then, the Treasury’s macroeconomic model, which calculated national income as the sum of consumption, investment, and government spending, effectively made the budget the regulator of economic performance. By varying its own spending and taxation, the Treasury could nudge the UK toward full employment, real GDP growth, and low inflation. Subsequent models, influenced by the monetarist and New Classical revolutions in economic theory, have assigned the state a far more limited capacity to intervene. Yet the belief that governments are responsible for economic performance still runs deep.

The UK’s recent budget announcement is a case in point. When presenting his budget to Parliament this month, Chancellor of the Exchequer Jeremy Hunt sought to reassure lawmakers that the government is on track to tame inflation, reduce debt, and boost economic growth. Hunt even went so far as to present detailed predictions for each of the next five years. As he put it, “We are following the plan, and the plan is working.” Yet, it has long been clear that inflation and growth depend on global trends over which the British chancellor has no control. 

The fact is that international finance, technology, and geopolitics rule out any possibility of “steering” the UK economy. While these variables were regarded as stable (or at least predictable) parameters of national policymaking as late as the 1990s, today all three are considered a source of exogenous shocks – unpredictable or unexpected events – with the potential to spoil any budget forecasts. 

No UK policymaker, for example, predicted the global financial meltdown caused by the 2008 collapse of Lehman Brothers. Likewise, no one can foresee the repercussions of the recent failure of Silicon Valley Bank and Credit Suisse, especially in an era when every possible effect of every disruptive economic event is amplified on social media. And with heightened geopolitical tensions threatening global supply chains, the models on which policymakers like Hunt rely are becoming increasingly obsolete. 

Specifically, the relationship between fiscal and monetary policy is veiled in mystery. The reigning economic model assumes that controlling inflation is a necessary and sufficient condition for macroeconomic stability, and that inflation is primarily caused by budget deficits, or “governments printing too much money.” With that in mind, the government outsourced the task of steering the economy to the Bank of England in 1997, while the Treasury remained in charge of balancing the budget over a five-year forecast period and reducing net debt to a sustainable level.

The combination of BOE independence and fiscal discipline was supposed to assure markets that politicians would not go on spending sprees. But, given that the BOE has been printing as much money as politicians want since the start of the COVID-19 pandemic, the separation between fiscal and monetary policy has become largely fictional, along with the stability and prosperity it was said to ensure. 

Hunt should have looked to US President Joe Biden for more creative economic thinking. Biden’s Inflation Reduction Act, which includes $370 billion in clean-energy subsidies, is based on an almost-forgotten macroeconomic idea known as the balanced-budget multiplier: higher public spending can be paid for by raising taxes on the rich. Biden’s stated policy is still to balance the budget, but this approach would enable him to do so while boosting spending, rather than adopting the sort of austerity policies UK governments continue to pursue. 
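The arithmetic behind the balanced-budget multiplier can be sketched in a few lines of standard textbook algebra. The notation below, and the simplifying assumption that those who are taxed have a lower marginal propensity to consume than the average, are illustrative only; they are not a description of the administration’s own modeling.

% Keynesian income-expenditure sketch: c is the average marginal propensity
% to consume, c_T the (lower) propensity of the taxed group; both lie between 0 and 1.
\begin{align*}
Y &= C + I + G, \qquad C = a + c\,Y - c_T\,T \\
\Delta Y &= \frac{\Delta G - c_T\,\Delta T}{1 - c} \\
\Delta T = \Delta G \;\Rightarrow\; \Delta Y &= \frac{1 - c_T}{1 - c}\,\Delta G \;\ge\; \Delta G \quad \text{whenever } c_T \le c
\end{align*}

In the simplest case, with a single propensity to consume (c_T = c), tax-financed spending still raises output dollar for dollar, because the government spends the whole tax take while taxpayers would have saved part of it. Taxing those who save most of their income (c_T < c) pushes the multiplier above one, which is the logic behind “paying for” higher spending with taxes on the rich.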

Biden’s economic policy represents a welcome return to the old Keynesian view that aggregate demand matters. By contrast, Hunt’s plan to boost economic growth depends entirely on remedying so-called structural (or supply-side) deficiencies. 

The UK’s puzzling labor shortage underscores the inadequacy of the British government’s approach. The number of unemployed people is 1.3 million, and millions more working-age Britons are not employed or actively seeking work. Yet many businesses are struggling to find workers, with job vacancies jumping to 1.1 million. Hunt’s answer is to increase incentives for the “economically inactive” to rejoin the labor market. But in practice, he is encouraging people to apply for jobs that do not exist. 

The reason is that despite supply bottlenecks in sectors such as retail, hospitality, and agriculture, the economy as a whole is experiencing a deficiency of aggregate demand. Given that the British economy has still not recovered to its 2019 level, and that consumption has fallen while the population has grown by 1.7 million between 2020 and 2023, this should not come as a surprise. Yet, the government’s latest budget makes no mention of boosting aggregate demand for labor, either on the consumption or the investment side. 

In late 2020, former UK Prime Minister Gordon Brown and I proposed a scheme whereby the government would guarantee a job and/or training to anyone who could not find work in the private sector, at a fixed hourly rate not lower than the national minimum wage. This, we argued, would be the quickest way to boost aggregate consumption in the economy without resorting to complicated forecasts about the size of the output gap. As John Maynard Keynes once said, “Look after unemployment, and the budget will look after itself.” 

On the investment side, Hunt announced the creation of 12 investment zones free from the burdensome regulations that supposedly stifle entrepreneurs’ “animal spirits.” In concentrating on these supply-side measures, however, Hunt has missed an opportunity to beef up two nascent publicly financed investment institutions: the UK Infrastructure Bank, which was set up in June 2021 to provide finance for projects to tackle climate change and support local economic growth, and the British Business Bank, created in 2014 to fill the financing gap for small businesses. By enhancing public investment, the government could improve business expectations and divert investment from speculation to critical green-energy projects and regional development.

At a time of global turmoil and heightened uncertainty, the national budget’s primary purpose is not to steer the economy to the point of imagined stability. Rather, policymakers must use fiscal policy to protect the least well-off from disruptive external blows and to achieve maximum strategic autonomy in a world that is spinning out of control.

Globalization’s Latest Last Stand

With the world increasingly turning away from economic integration and cooperation, the second wave of globalization is threatening to give way to fragmentation and conflict, as the first wave did in 1914. Averting catastrophe requires developing strong political foundations capable of sustaining a stable international order.

LONDON – Is the world economy globalizing or deglobalizing? The answer would have seemed obvious in 1990. Communism had just collapsed in Central and Eastern Europe. In China, Deng Xiaoping was unleashing capitalist enterprise. And political scientist Francis Fukuyama famously proclaimed the “end of history,” by which he meant the triumph of liberal democracy and free markets.

Years earlier, the British economist Lionel Robbins, a firm believer in free markets, warned that the shaky political foundations of the postwar international order could not support a globalized economy. But in the euphoria and triumphalism of the early 1990s, such warnings fell on deaf ears. This was, after all, a “unipolar moment,” and American hegemony was the closest thing to a world government. With the Soviet Union vanquished, the thinking went, the last political barrier to international economic integration had been removed. 

Dazzled by abstractions, economists and political scientists should have paid more attention to history. Globalization, they would have learned, tends to come in waves, which then recede. The first wave of globalization, which took place between 1880 and 1914, was enabled by a huge reduction in transport and communication costs. By 1913, commodity markets were more integrated than ever, the gold standard maintained fixed exchange rates, and capital – protected by empires – flowed freely and with little risk. 

Alas, this golden age of liberalism and economic integration gave way to two world wars, separated by the Great Depression. Trade shrank to 1800 levels, capital flows dried up, governments imposed tariffs and capital controls to protect industry and employment, and the largest economies separated into regional blocs. Germany, Japan, and Italy went to war to establish blocs of their own. 

The second wave of globalization, which began in the 1980s and accelerated following the end of the Cold War and the rise of digital communications, is now rapidly retreating. The global trade-to-GDP ratio fell from a peak of 61% just before the 2008 financial crisis to 52% in 2020, and capital movements have been increasingly restricted in recent years. As the United States and China lead the formation of separate geopolitical blocs, and the world economy gradually shifts from interconnectedness to fragmentation, deglobalization seems well underway. 

To understand why globalization has broken down for a second time, it is worth revisiting John Maynard Keynes’s memorable description of London on the eve of World War I. “The projects and politics of militarism and imperialism, of racial and cultural rivalries, of monopolies, restrictions, and exclusion, which were to play the serpent to this paradise,” he wrote in 1919, “were little more than the amusements of [the investor’s and consumer’s] daily newspaper, and appeared to exercise almost no influence at all on the ordinary course of social and economic life, the internationalization of which was nearly complete in practice.”

In our own time, geopolitics is once again threatening to break the international order. Commerce, as Montesquieu observed, has a pacifying effect. But free trade requires strong political foundations capable of soothing geopolitical tensions; otherwise, as Robbins warned, globalization becomes a zero-sum game. In retrospect, the failure to make the United Nations Security Council truly representative of the world’s population might have been the original sin that led to the current backlash against economic openness. 

But geopolitics is not the only reason for the breakdown of globalization’s second wave. Neoliberal economics, which came to dominate policymaking in the 1980s, has fueled global instability in three major ways. 

First, neoliberals fail to account for uncertainty. The efficient-market hypothesis – the belief that financial markets price risks correctly on average – provided an intellectual basis for deregulation and blinded policymakers to the dangers of setting finance free. In the run-up to the 2008 crisis, experts and multilateral institutions, including the International Monetary Fund, were still claiming that the banking system was safe and that markets were self-regulating. While that sounds ridiculous in retrospect, similar views still lead banks to underprice economic risks today. 

Second, neoliberal economists have been oblivious to global imbalances. The pursuit of market-led economic integration accelerated the transfer of manufacturing production from developed economies to developing economies. Counterintuitively, though, it also led to a flow of capital from poor to rich countries. Simply put, Chinese workers supported the West’s living standards while Chinese production decimated Western manufacturing jobs. This imbalance has fueled protectionism, as governments respond to public pressure by restricting trade with low-cost producers, and contributed to the splintering of the world economy into rival economic blocs. 

Lastly, neoliberal economics is indifferent to rising inequality. Following four decades of hyper-globalization, tax cuts, and fiscal tightening, the richest 10% of the world’s population own 76% of the total wealth, while the poorest half own barely 2%. And as more and more wealth ends up in the hands of tech speculators and fraudsters, the so-called “effective altruism” movement has invoked Laffer curve-style logic to argue that allowing the rich to become even richer would encourage them to donate to charity. 

Will globalization’s second wave collapse into a world war, as the first one did? It is certainly possible, especially given the lack of intellectual heft among the current crop of world leaders. To prevent another descent into global chaos, we need bold ideas that build on the economic and political legacies of Bretton Woods and the 1945 UN Charter. The alternative could be a more or less direct path to Armageddon.

The Return of Thoughtcrime

The UK’s draconian Public Order Bill, which seeks to restrict certain forms of protest used by climate activists, will expand the state’s ability to detain people deemed disruptive and limit the courts’ ability to restrain it. This will align the British legal system with those of authoritarian countries like Russia.

LONDON – In December 1939, police raided the home of George Orwell, seizing his copy of D.H. Lawrence’s Lady Chatterley’s Lover. In a letter to his publisher after the raid, Orwell wondered whether “ordinary people in countries like England grasp the difference between democracy and despotism well enough to want to defend their liberties.”

Nearly a century later, the United Kingdom’s draconian Public Order Bill, passed by the UK House of Commons last year and now being considered in the House of Lords, vindicates Orwell’s doubt. The bill seeks to restrict the right to protest by extending the scope of criminality, reversing the presumption of innocence in criminal trials, and weakening the “reasonableness” test for coercive action. In other words, it widens the government’s scope for discretionary action while limiting the courts’ ability to restrain it.

When the police seized Orwell’s copy of Lady Chatterley’s Lover, the novel was banned under the Obscene Publications Act of 1857, which prohibited the publication of any material that might “deprave and corrupt” readers. In 1959, the nineteenth-century law was replaced by a more liberal measure that enabled publishers to defend against obscenity charges by showing that the material had artistic merit and that publishing it was in the public interest. Penguin Books succeeded with this defense when it was prosecuted for publishing Lady Chatterley’s Lover in 1960; by the 1980s, the book was taught in public schools.

But while Western democracies have stopped trying to protect adults from “depravity,” they are constantly creating new crimes to protect their “security.” The Public Order Bill creates three new criminal offenses: attaching oneself to objects or buildings (“locking on” or “going equipped to lock on”), obstructing major transport works, and interfering with critical national infrastructure projects. All three provisions target forms of peaceful protest, such as climate activists blocking roads or gluing themselves to famous works of art, that the government considers disruptive. Disrupting critical infrastructure could certainly be construed as a genuine threat to national security. But this bill, which follows a raft of other recently enacted or proposed laws intended to deal with “the full range of modern-day state threats,” should be seen as part of a broader government crackdown on peaceful protest.

By transferring the burden of proof from the police to alleged offenders, the Public Order Bill effectively gives police officers the authority to arrest a person for, say, “attaching themselves to another person.” Rather than requiring the police to show reasonable cause for the arrest, the person charged must “prove that they had a reasonable excuse” for locking arms with a friend.

The presumption of innocence is not just a legal principle; it is a key political principle of democracy. All law-enforcement agencies consider citizens potential lawbreakers, which is why placing the burden of proof on the police is an essential safeguard for civil liberties. The Public Order Bill’s presumption of guilt would reduce the extent to which the police are answerable to the courts, aligning the UK legal system with those of authoritarian countries like Russia and China, where acquittals are rare.

The bill also weakens the “reasonableness” requirement for detention and banning orders, allowing officers to stop and search any person or vehicle without any grounds for suspicion if they “reasonably believe” that a protest-related crime may be committed. Resistance to any such search or seizure would be a criminal offense. Magistrates could also ban a person or organization from participating in a protest in a specified area for up to five years if their presence was deemed likely to cause “serious disruption.” And since being “present” at the crime scene includes electronic communications, the ban could involve digital monitoring.

The question of what should be considered reasonable grounds for coercive action was raised in the landmark 1942 case of Liversidge v. Anderson. Robert Liversidge claimed that he had been unlawfully detained on the order of then-Home Secretary John Anderson, who refused to disclose the grounds for the arrest. Anderson argued that he had “reasonable cause to believe” that Liversidge was a national-security threat, and that he had acted in accordance with wartime defense regulations that suspended habeas corpus. The House of Lords ultimately deferred to Anderson’s view, with the exception of Lord Atkin, who in his dissent accused his peers of being “more executive-minded than the executive.”

Even in wartime, Atkin claimed, individuals should not be arbitrarily detained or deprived of their property. If the state is not required to provide reasons that could stand up in court, the courts cannot restrain the government. The UK’s current wave of national security and counter-terrorism bills directly challenges this view, making Atkin’s dissent even more pertinent today than it was during the war.

Law enforcement’s growing use of big data and artificial intelligence makes the UK government’s efforts to curtail the right to protest even more worrisome. While preventive policing is not new, the appearance of scientific impartiality could give it unlimited scope. Instead of relying on informers, police departments can now use predictive analytics to determine the likelihood of future crimes. To be sure, some might argue that, because authorities have so much more data at their disposal, predictive policing is more feasible today than it was in the 1980s, when the British sociologist Jean Floud advocated “protective sentences” for offenders deemed a grave threat to public safety. American University law professor Andrew Guthrie Ferguson, for example, has argued that “big data will illuminate the darkness of suspicion.”

But when considering such measures, we should keep in mind that the state can sometimes be far more dangerous than terrorists, and certainly more than glued-down protesters. We must be as vigilant against the lawmaker as we are against the lawbreaker. After all, we do not need an algorithm – or Orwell – to tell us that handing the government extraordinary powers could go horribly wrong.

Too Poor for War

Nov 8, 2022 ROBERT SKIDELSKY and PHILIP PILKINGTON

Decades of deindustrialization have hollowed out the UK economy and made it woefully ill-prepared for wartime disruptions. As the financial speculators who funded its current-account deficits turn against the pound, policymakers should consider Keynesian taxes and increasing public investment.

LONDON – A wartime economy is inherently a shortage economy: because the government must direct resources toward manufacturing guns, less butter is produced. And because incomes are not reduced while the supply of butter shrinks, a war economy tends to produce an inflationary surge unless policymakers cut civilian consumption to eliminate the excess demand.

In his 1940 pamphlet “How to Pay for the War,” John Maynard Keynes famously called for fiscal rebalancing, rather than budgetary expansion, to accommodate the growing needs of the United Kingdom’s World War II mobilization effort. To reduce consumption without driving up inflation, Keynes contended, the government had to raise taxes on incomes, profits, and wages. “The importance of a war budget is social,” he asserted. Its purpose is not only to “prevent the social evils of inflation,” but to do so “in a way which satisfies the popular sense of social justice whilst maintaining adequate incentives to work and economy.”

Joseph E. Stiglitz recently applied this approach to the Ukraine crisis. To ensure the fair distribution of sacrifice, he argues, governments must impose a windfall-profit tax on domestic energy suppliers (“war profiteers”). Stiglitz proposes a “non-linear” energy-pricing system whereby households and companies could buy 90% of the previous year’s supply at last year’s price. In addition, he advocates import-substituting policies such as increasing domestic food production and greater use of renewables.

Stiglitz’s proposals may work for the United States, which is far less vulnerable to external disruption than European countries. With a quarter of global GDP, 14% of world trade, and 60% of the world’s currency reserves, the US can afford belligerence. But the European Union cannot, and the UK even less so.

While the UK has been almost as aggressive as the US in its response to Russia’s actions, Britain is far less prepared to manage a war economy than it was in 1940: it makes fewer things, grows less food, and is more dependent on imports. The UK is more vulnerable to external shocks than any major Western power, owing to decades of deindustrialization that have shrunk its manufacturing sector from 23% of gross value added in 1980 to roughly 10% today. While the UK produced 78% of the food it consumed in 1984, this figure had fallen to 64% by 2019. The British economy’s growing reliance on imported energy has made it even less self-sufficient.

For decades, the financial sector propped up the UK’s hollowed-out economy. Financial flows into the City of London allowed the country to neglect trade and artificially maintain higher living standards than its export capacity warranted. Britain’s current-account deficit is now 7% of GDP, compared to a current-account surplus of 1.3% of GDP in 1980. Until recently, the British formula had been to finance its external deficit by attracting speculative capital into London via the financial industry, which had been deregulated by the “big bang” of 1986.

This was brilliant but unstable financial engineering: foreigners sent the UK goods that it otherwise could not afford, Britain sent them sterling in return, and foreigners used the pound to buy British-domiciled assets. But this was a short-term fix for the long-term decline of manufacturing, enabling the UK to live beyond its means without improving its productivity.

In his 1930 Treatise on Money, Keynes distinguishes between “financial circulation” and “industrial circulation.” The former is mainly speculative in purpose, and an economy that depends on speculative inflows experiences financial booms and busts without any improvement in its underlying growth potential. The UK’s strategy bore out this observation: it did little to develop exportable goods that could improve the current-account balance, and its success depended on foreigners not dumping the pound.

But the speculator’s logic, as George Soros explains, is to make a quick buck and get out before the crash. Relying on speculators is like a narcotics addiction: a temporary high becomes a necessary crutch. The energy crisis brought on by the Russia-Ukraine war was the equivalent of cold-turkey withdrawal, blowing an even larger hole in the UK’s trade balance. The current-account deficit is expected to increase to 10% of GDP by the end of 2023, providing short-term investors with a strong incentive to sell their sterling-denominated bonds.

The pound’s ongoing decline will make UK imports more expensive. And since import prices will likely rise faster than export values, the fall in sterling’s exchange rate will probably widen the current-account deficit, not least because the country’s diminished manufacturing sector depends heavily on imported inputs. As the pound depreciates, the price of these imports will increase, resulting in even greater erosion of living standards.
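The elasticity reasoning in the preceding paragraph is the textbook Marshall-Lerner condition. The version below is the standard simplification (trade initially balanced, prices set in the producers’ currencies) rather than an estimate based on UK data; TB, e, and the elasticities are generic notation.

% Marshall-Lerner condition: a depreciation of the home currency (a rise in e)
% improves the trade balance TB only if the absolute price elasticities of
% export and import demand (eta_x and eta_m) sum to more than one.
\[
\frac{\partial\,TB}{\partial\,e} > 0
\quad\Longleftrightarrow\quad
|\eta_x| + |\eta_m| > 1
\]

When an economy has little spare export capacity and its remaining manufacturers rely heavily on imported inputs, both elasticities are low in the short run, so the condition fails and depreciation widens the trade deficit before it can narrow it (the familiar J-curve).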

This leaves policymakers with few good options. The Bank of England has already raised interest rates to maintain inflows of foreign capital, but high interest rates will likely crash housing and other asset markets that have become addicted to rock-bottom rates over the past 15 years. Taking steps to balance the budget may temporarily calm markets, but such measures would not address the British economy’s underlying weakness. Moreover, there is no evidence that fiscal consolidation leads to economic growth.

One possible remedy would be to revive government investment. UK public investment fell from an average of 47.3% of total investment between 1948 and 1976 to 18.4% between 1977 and 2007, leaving overall investment dependent on volatile short-term expectations.

The only way the UK could “pay for the war” would be to implement an industrial strategy that aims to increase self-sufficiency in energy, raw materials, and food production. But such a policy would take years to bear fruit.

All European countries, not just Britain, face an energy crisis as a result of the disruption of oil and gas supplies from Russia, and policymakers are eager to increase energy inflows. But any deal with Russia as it wages its war on Ukraine with apparent disregard for human life would carry enormous moral and political costs.

One possible way forward may be to reach an agreement to ease economic sanctions in exchange for a resumption of gas flows. Given its special economic vulnerability, and following Brexit, Britain is well placed to explore this idea on behalf of – but independently from – the EU.

A limited agreement could ease Europe’s energy crisis while allowing continued military support for Ukraine. But it should be conditional on Russia reducing the intensity of its horrific “special military operation.” Negotiation of a limited energy-sanctions deal could, perhaps, open the door to a wider negotiation aimed at ending the war before it engulfs Europe.

As for the UK, in the short term it will remain dependent on City-generated financial inflows to prevent a catastrophic near-term collapse of the pound, forcing the new British Chancellor, Jeremy Hunt, to scramble to “restore confidence” in the British economy. In lieu of Keynesian taxes or public investment, that will most likely mean drinking more of the austerity poison that caused Britain’s current malady.

Gorbachev’s Tragic Legacy

Oct 19, 2022 ROBERT SKIDELSKY

Admired in the West but loathed by his countrymen as a harbinger of Russia’s post-Cold War misfortune, Mikhail Gorbachev fully grasped the immense challenges of reforming the ailing Soviet Union. Today’s Russia largely reflects the anti-Western grievances stemming from his failure.

LONDON – Mikhail Gorbachev, the Soviet Union’s last leader, was buried last month at the Novodevichy Cemetery in Moscow next to his wife Raisa and near fellow Soviet leader Nikita Khrushchev. To no one’s surprise, Russian President Vladimir Putin did not attend the funeral. Novodevichy, after all, is where “unsuccessful” Soviet leaders had been consigned to their final rest.

Putin’s snub reminded me of a conversation I had two decades ago during a midnight stroll through Red Square. On impulse, I asked the army officer stationed in front of Lenin’s Tomb who was buried in the Soviet Necropolis behind it, and he offered to show me. There, I saw a succession of graves and plinths for Soviet leaders: from Stalin to Leonid Brezhnev, Alexei Kosygin, and Yuri Andropov. The last plinth was unoccupied. “For Gorbachev, I suppose?” I asked. “No, his place is in Washington,” the officer replied. 

Ironically, Gorbachev has been lionized in the West for accomplishing something he never set out to do: bringing about the end of the Soviet Union. He was awarded the Nobel Peace Prize in 1990, yet Russians widely regarded him as a traitor. In his ill-fated attempt at a political comeback in the 1996 Russian presidential election, he received just 0.5% of the popular vote.

Gorbachev remains a reviled figure in Russia. A 2012 survey by the state-owned pollster VTsIOM found that Gorbachev was the most unpopular of all Russian leaders. According to a 2021 poll, more than 70% of Russians believe their country moved in the wrong direction under his rule. Hardliners hate him for dismantling Soviet power, and liberals despise him for clinging to the impossible ideal of reforming the communist regime.

I became acquainted with Gorbachev in the early 2000s when I attended meetings of the World Political Forum, the think tank he founded in Turin. The organization was purportedly established to promote democracy and human rights. But in practice, its events were nostalgic reminiscences where Gorbachev held forth on “what might have been.” He was usually flanked by other has-beens of his era, including former Polish leaders Wojciech Jaruzelski and Lech Wałęsa, former Hungarian Prime Minister Gyula Horn, Russian diplomat Alexander Bessmertnykh, and a sprinkling of left-leaning academics. 

Gorbachev’s idea of a “Third Way” between socialism and capitalism was briefly fashionable in the West but was soon swamped by neoliberal triumphalism. Nonetheless, I liked and respected this strangely visionary leader of the dying USSR, who refused to use force to resist change.

Today, most Russians cast Gorbachev and Boris Yeltsin as harbingers of Russia’s misfortune. Putin, on the other hand, is widely hailed as a paragon of order and prosperity who has reclaimed Russia’s leading role on the world stage. In September, 60% of Russians said they believe that their country is heading in the right direction, though this no doubt partly reflects the tight control Putin exercises over television news (the main source of information for most citizens). 

In the eyes of most Russians, Gorbachev’s legacy is one of naivete and incompetence, if not outright betrayal. According to the prevailing narrative, Gorbachev allowed NATO to expand into East Germany in 1990 on the basis of a verbal commitment by then-US Secretary of State James Baker that the alliance would expand “not one inch eastwards.” Gorbachev, in this telling, relinquished Soviet control over Central and Eastern Europe without demanding a written assurance.

In reality, however, Baker was in no position to make such a promise in writing, and Gorbachev knew it. Moreover, Gorbachev repeatedly confirmed over many years that a serious promise not to expand NATO eastward was never made.

In any case, the truth is that the Soviet Union’s control over its European satellites had become untenable after it signed the 1975 Helsinki Final Act. The accord, signed by the United States, Canada, and most of Europe, included commitments to respect human rights, including freedom of information and movement. Communist governments’ ability to control their populations gradually eroded, culminating in the chain of mostly peaceful uprisings that ultimately led to the Soviet Union’s dissolution.

Yet there is a grain of truth to the myth of Gorbachev’s capitulation. After all, the USSR had not been defeated in battle, as Germany and Japan were in 1945, and the formidable Soviet military machine remained intact in 1990. In theory, Gorbachev could have used tanks to deal with the popular uprisings in Eastern Europe, as his predecessors did in East Germany in 1953, Hungary in 1956, and Czechoslovakia in 1968. 

Gorbachev’s refusal to resort to violence to preserve the Soviet empire resulted in a bloodless defeat and a feeling of humiliation among Russians. This sense of grievance has fueled widespread distrust of NATO, which Putin used years later to mobilize popular support for his invasion of Ukraine. 

Another common misconception is that Gorbachev dismantled a functioning economic system. In fact, far from fulfilling Khrushchev’s promise to “bury” the West economically, the Soviet economy had been declining for decades. 

Gorbachev understood that the Soviet Union could not keep up with the US militarily while satisfying civilian demands for higher living standards. But while he rejected the Brezhnev era’s stagnation-inducing policies, he had nothing coherent to put in their place. Instead of facilitating a functioning market economy, his rushed abandonment of the central-planning system enriched the corrupt managerial class in the Soviet republics and led to a resurgence of ethnic nationalism.

To my mind, Gorbachev is a tragic figure. While he fully grasped the immense challenges facing Soviet communism, he had no control over the forces he helped unleash. Russia in the 1980s simply lacked the intellectual, spiritual, and political resources to overcome its underlying problems. But while the Soviet empire has been extinct for 30 years, many of the dysfunctions that contributed to its demise now threaten to engulf the entire world.

Requiem for an Empire

Sep 12, 2022 ROBERT SKIDELSKY

Since World War II, Britain’s influence in the world has relied on its “special relationship” with the United States, its position as head of the Commonwealth (the British Empire’s successor), and its position in Europe. The Americans are still there, but Europe isn’t, and now the head of the Commonwealth isn’t, either.

LONDON – Amid the many, and deserved, tributes to Queen Elizabeth II, one aspect of her 70-year reign remained in the background: her role as monarch of 15 realms, including Australia, New Zealand, and Canada. She was also the head of the Commonwealth, a grouping of 56 countries, mainly republics.

This community of independent states, nearly all of them former territories of the British Empire, has been crucial in conserving a “British connection” around the world in the post-imperial age. Whether this link is simply a historical reminiscence, whether it stands for something substantial in world affairs, and whether and for how long it can survive the Queen’s passing, have become matters of great interest, especially in light of Britain’s withdrawal from the European Union.

In the nineteenth-century era of Pax Britannica, Britain exercised global power on its own. The sun never set on the British Empire: the British navy ruled the waves, British finance dominated world markets, and Britain maintained the European balance of power. This era of “splendid isolation” – never as splendid or isolated as history textbooks used to suggest – ended with World War I, which gravely wounded Britain’s status as a world power and correspondingly strengthened other claimants to that role.

As the results of WWI were confirmed by World War II, British foreign policy came to center on the doctrine of the “three circles.” Britain’s influence in the world would rely on its “special relationship” with the United States, its position as head of the Commonwealth (the empire’s successor), and its position in Europe. By its membership of these overlapping and mutually reinforcing circles, Britain might hope to maximize its hard and soft power and mitigate the effects of its military and economic “dwarfing.”

Different British governments attached different weights to the three roles in which Britain was cast. The most continuously important was the relationship with the US, which dates from WWII, when the Americans underwrote Britain’s military and economic survival. The lesson was never forgotten. Britain would be the faithful partner of the US in all its global enterprises; in return, Britain could draw on an American surplus of goodwill possessed by no other foreign country. For all the pragmatic sense it made, one cannot conceive of such a connection forged or enduring without a common language and a shared imperial history.

Imperial history was also central to the second circle. The British Empire of 1914 became the British Commonwealth in 1931, and finally just The Commonwealth, with the Queen as its titular head. Its influence lay in its global reach. Following the contours of the British Empire, it was the only world organization (apart from the United Nations and its agencies) which spanned every continent.

The Commonwealth conserved the British connection in two main ways. First, it functioned as an economic bloc through the imperial preference system of 1932 and the sterling area that was formalized in 1939, both of which survived into the 1970s. Second, and possibly more durably, its explicitly multiracial character, so ardently supported by the Queen, served to soften both global tensions arising from ethnic nationalism, and ethnic chauvinism in the “mother country.” Multicultural Britain is a logical expression of the old multicultural empire.

The European link was the weakest and was the first to snap. This was because Britain’s historic role in Europe was negative: to prevent things from happening there which might endanger its military security and economic livelihood. To this end, it opposed all attempts to create a continental power capable of bridging the Channel. Europe was just 20 miles away, and British policy needed to be ever watchful that nasty things did not happen “over there.”

John Maynard Keynes expressed this permanent sense of British estrangement from the Continent. “England still stands outside Europe,” he wrote in 1919. “Europe’s voiceless tremors do not reach her: Europe is apart and England is not of her flesh and body.” The Labour leader, Hugh Gaitskell, famously evoked this sense of separation when he played the Commonwealth card in 1962, urging his party not to abandon “a thousand years of history” by joining the European Economic Community.

Britain’s policy towards Europe has always been to prevent the emergence of a Third Force independent of US-led NATO. Charles de Gaulle saw this clearly, vetoing Britain’s first application to join the EEC in 1963 in order to prevent an American “Trojan Horse” in Europe.

Although Prime Minister Tony Blair wanted Britain to be at “the heart” of Europe, Britain pursued the same game inside the EU from 1974 until 2021. The only really European-minded prime minister in this period was Edward Heath. Otherwise, British governments have sought to maximize the benefits to Britain of trade and tourism, while minimizing the dangers of political contamination. Today, it is not surprising that Britain joins the US to project NATO power in Eastern Europe over the stricken torso of the EU itself.

So, Britain is left with just two circles. In the wake of Brexit, the Queen’s legacy is clear. Through her official position and personal qualities, she preserved the Commonwealth as a possible vehicle for projecting what remains of Britain’s hard power, such as military alliances in the South Pacific. And whatever one may think of Britain’s hard power, its soft power – reflecting its trading relationships, its cultural prestige in Asia and Africa, and its multicultural ideal – is a global public good in an age of growing ethnic, religious, and geopolitical conflict.

I doubt whether the two remaining circles can compensate for Britain’s absence from the third. The question that remains to be answered is how much the Commonwealth’s durability depended on the sheer longevity of the late monarch, and how much of it can be preserved by her successor.

Mind the Policy Gaps

Aug 22, 2022 ROBERT SKIDELSKY

The widening gaps in policy formation nowadays reflect the division of labor and increasing specialization that has taken us from the sixteenth-century ideal of the Renaissance man. And today’s biggest policymaking gap has grown so large that it threatens global catastrophe.

LONDON – Just as the insistent demand for more “transparency” is a sure sign of increasing opacity, the current clamor for “joined-up thinking” indicates that the need for it far outstrips the supply. With its recent report on energy security, the UK House of Lords Economic Affairs Committee has added its voice to the chorus.

The report’s language is restrained, but its message is clear: Without a “joined-up” energy policy, the United Kingdom’s transition to net zero by 2050 will be “disorderly” (read: “will not happen”). For example, the policy aimed at improving home insulation is at odds with local authorities’ listed-building regulations.

In April, the government called on the Bank of England and financial regulators to “have regard to” energy security. What does this mean? Which institution is responsible for which bits of energy security? How does energy security relate to the net-zero goal? Never mind gaps in the data: The real problem is yawning chasms in thinking.

In a masterly understatement, the committee’s report says that “The Russian invasion of Ukraine has created global energy supply issues.” In fact, economic sanctions against the invader have contributed significantly to a massive energy and food crisis that threatens the sanctioning countries with stagflation and many people in developing economies with starvation.

China provides key minerals for renewable-energy technologies, including wind turbines and solar cells, and supplies 66% of finished lithium-ion batteries. The report concludes that the UK must “not become reliant on strategic competitors, notably China, for critical minerals and components.” Moreover, the government “will need to ensure that its foreign and trade policies […] and its policy on net zero are aligned.” Quite a bit more joining up to do, then.

The widening gaps in policy formation reflect the increasing division of labor resulting from the relentless march of complexity. Today’s policymakers and their advisers know more and more about less and less, recalling Adam Smith’s description in The Wealth of Nations of a pin factory worker:

“The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same, has no occasion to exert his understanding, or to exercise his invention in finding out expedients for removing difficulties which never occur. He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become.”

No one in Smith’s factory would need to know how to make a pin, or even what the purpose of producing one was. They would know only how to make a part of a pin. Likewise, the world is becoming full of “experts” who know only little bits of their subject.

The ideal of the “Renaissance man,” who could do a lot of joined-up thinking, did not survive the growing division of labor. By the eighteenth century, knowledge was being split up into “disciplines.” Now, subdisciplines sprout uncontrollably, and communication of their findings to the public is left to journalists who know almost nothing about everything.

Today’s biggest gap, one so large that it threatens catastrophe, is between geopolitics and economics. Foreign ministries and Treasuries no longer talk to each other. They inhabit different worlds, use different conceptual languages, and think about different problems.

The geopolitical world is divided up into “strategic partners” and “strategic rivals.” Borders are alive and well. States have conflicting national interests and pursue national-security policies. Economics, in contrast, is the science of a single market: Its ideal is economic integration across frontiers and a global price mechanism that automatically harmonizes conflicting preferences. Economics also tells us that commerce softens the asperities of politics, creating over time a single community of learning and culture.

The eighteenth-century English historian Edward Gibbon described history as “little more than the register of the crimes, follies, and misfortunes of mankind.” But the end of the Cold War led to hopes that the world was at last growing up; in 1989, the American academic Francis Fukuyama proclaimed the “end of history.”

Today, however, geopolitics is back in the saddle. The West regards Russia as a pariah because of its invasion of Ukraine; China, if not yet quite meriting that status, is a strategic rival with which trade in many areas should be shunned. At the same time, Western governments also insist on the need for global cooperation to tackle potentially lethal climate change and other existential dangers like nuclear proliferation and pandemics.

In 2012, for example, the University of Cambridge established the Centre for the Study of Existential Risk to help mitigate threats that could lead to humanity’s extinction. Sixty-five years earlier, atomic scientists from the Manhattan Project founded a bulletin to warn humanity of the possible negative consequences arising from accelerating technological advances. They started a Doomsday Clock, the time of which is announced each January. In 1947, the clock was set at seven minutes to midnight. Since January 2020, it has stood at 100 seconds to midnight, marking the closest humanity has come to Armageddon in 75 years.

The danger of nuclear proliferation, which prompted the clock’s creation, is still very much with us. As the bulletin points out, “Development of hypersonic glide vehicles, ballistic missile defenses, and weapons-delivery systems that can flexibly use conventional or nuclear warheads may raise the probability of miscalculation in times of tension.” Do those who urge the expulsion of Russian forces from Ukraine, by economic or military means, consider this risk? Or is this another “gap” in their knowledge of the world?

It is a tragedy that the economic basis of today’s world order, such as it is, is now being put at risk. That risk is the result of actions for which all the great powers share responsibility. The ironically named United Nations, which exists to achieve planetary security, is completely marginalized. Its virtual absence from the big scenes of international conflict is the biggest gap of all, and one which jeopardizes our common future.

Boris Johnson’s Fall – and Ours

Jul 19, 2022 ROBERT SKIDELSKY

Although words like “unprincipled,” “amoral,” and “serial liar” seem to describe the outgoing British prime minister accurately, they describe more successful political leaders just as well. To explain Johnson’s fall, we need to consider two factors specific to our times.

LONDON – Nearly all political careers end in failure, but Boris Johnson is the first British prime minister to be toppled for scandalous behavior. That should worry us.

The three most notable downfalls of twentieth-century British leaders were caused by political factors. Neville Chamberlain was undone by his failed appeasement policy. The Suez fiasco forced Anthony Eden to resign in 1957. And Margaret Thatcher fell in 1990 because popular resistance to the poll tax persuaded Tory MPs that they could not win again with her as leader. 

True, Harold Macmillan was undone in 1963 by the Profumo sex scandal, but this involved a secretary of state for war and possible breaches of national security. Election defeats following economic failure brought down Edward Heath and James Callaghan in the 1970s. Tony Blair was forced to resign by the Iraq debacle and Gordon Brown’s impatience to succeed him. David Cameron was skewered by Brexit, and Theresa May by her failure to deliver Brexit. 

No such events explain Johnson’s fall. 

David Lloyd George, a much greater leader than Johnson, is his only serious rival in sleaze. But though the sale of seats in the House of Lords, slipshod administrative methods, and dishonesty had weakened Lloyd George, the immediate cause of his fall (exactly a century ago) was his mishandling of the Chanak crisis, which brought Britain and Turkey to the brink of war. 

The more familiar comparison is with US President Richard Nixon. Every Johnson misdemeanor is routinely labeled with the suffix “-gate,” after the Watergate break-in that ended Nixon’s presidency.

John Maynard Keynes called Lloyd George a “crook”; Nixon famously denied that he was one. Neither they nor Johnson were crooks in the technical sense (of being convicted of crimes), but Nixon would have been impeached in 1974 had he not resigned, and Johnson was fined £50 for breaking lockdown rules. Moreover, all three showed contempt for the laws they were elected to uphold, and for the norms of conduct expected from public officials. 

We struggle to describe their character flaws: “unprincipled,” “amoral,” and “serial liar” seem to capture Johnson. But they describe more successful political leaders as well. To explain his fall, we need to consider two factors specific to our times. 

The first is that we no longer distinguish personal qualities from political qualities. Nowadays, the personal really is political: personal failings are ipso facto political failings. Gone is the distinction between the private and the public, between subjective feeling and objective reality, and between moral and religious matters and those that government must address.

Politics has crossed into the realm previously occupied by psychiatry. This was bound to happen once affluence undermined the old class basis of politics. Questions of personal identity arising from race, gender, sexual preference, and so on now dominate the spaces vacated by the politics of distribution. Redressing discrimination, not addressing inequality, has become the task of politics.

Johnson is both a creature and a victim of identity politics. His rhetoric was about “leveling up” and “our National Health Service.” But, in practice, he made his personality the content of his politics. No previous British leaders would have squandered their moral capital on trivial misdemeanors and attempted cover-ups, because they knew that it had to be kept in reserve for momentous events. But momentous events are now about oneself, so when a personality is seen as flawed, there is no other story to tell. 

Johnson’s personality-as-politics was also the creation of the media. In the past, newspapers, by and large, reported the news; now, focusing on personalities, they create it. This change has given rise to a corrupt relationship: personalities use the media to promote themselves, and the media expose their frailties to sell copy. 

There has always been a large market for sexual and financial gossip. But even in the old “yellow press,” there was a recognized sphere of public events that took priority. Now the gossip stories are the public events. 

This development has radically transformed public perceptions about the qualities a political leader should have. Previous generations of political leaders were by no means all prudes. They lied, drank, fornicated, and took bribes. But everyone concerned with politics recognized that it was important to protect the public sphere. Leaders’ moral failings were largely shielded from scrutiny, unless they became egregious. And even when the public became aware of them, they were forgiven, provided the leaders delivered the goods politically. 

Most of the offenses that led to Johnson’s resignation would never have been reported in the past. But today the doctrine of personal accountability justifies stripping political leaders naked. Every peccadillo, every lapse from correct expression, becomes a credibility-destroying “disgrace” or “shame.” People’s ability to operate in the public sphere depends on privacy. Once that is gone, so is their ability to act effectively when it matters.

The other new factor is that politics is no longer viewed as a vocation so much as a stepping stone to money. Media obsession with what a political career is worth, rather than whether politicians are worthy of their jobs, is bound to affect what politically ambitious people expect to achieve and the public’s view of what to expect from them. Blair is reported to have amassed millions in speaking engagements and consultancies since leaving office. In keeping with the times, The Times has estimated how much money Johnson could earn from speaking fees and book deals, and how much more he is worth than May. 

In his resignation speech, Johnson sought to defend the “best job in the world” in traditional terms, while criticizing the “eccentricity” of being removed in mid-delivery of his promises. But this defense of his premiership sounded insincere, because his career was not a testimony to his words. The cause of his fall was not just his perceived lack of morality, but also his perceived lack of a political compass. For Johnson, the personal simply exposed the hollowness of the political.

Russia’s Path to Premodernity

Jun 14, 2022 ROBERT SKIDELSKY

The Stalinist retreat from science and logic persisted following the Soviet Union’s collapse and is now the main tendency of Russian President Vladimir Putin’s rule. With his faith-based mythology, warping of history, and denial of facts, Putin’s withdrawal from contemporary Europe could not be starker.

LONDON – The Russian writer Pyotr Chaadayev said of his country that “we have never advanced along with other people; we are not related to any of the great human families; we belong neither to the West nor to the East, and we possess the traditions of neither. Placed, as it were, outside of the times,” he wrote, “we have not been affected by the universal education of mankind.”

That was in 1829. The “riddle, wrapped in a mystery, inside an enigma,” as Winston Churchill described Russia more than a century later, is no closer to being solved today. The philosopher John Gray recently wrote that Russian President Vladimir Putin “is the face of a world the contemporary Western mind does not comprehend. In this world, war remains a permanent part of human experience; lethal struggles over territory and resources can erupt at any time; human beings kill and die for the sake of mystical visions.” That is why Western commentators and liberal Russians are baffled by Putin’s so-called “special military operation” in Ukraine.

Personality-based explanations for Putin’s actions are the easiest to advance – and the most facile. Putin is acting neither like an expert chess player, calculating every move, nor like a ruler unhinged by power or steroids.

Rather, Putin has a distorted, or at least one-sided, view of Russian history, and of what constitutes Russia’s special virtue. But this does not explain the widespread popular and intellectual support in Russia for his justificatory narrative regarding Ukraine. We are all to some extent captives of our national myths. It is just that Russian mythology is out of step with “the universal education of mankind.”

We expect Russia to behave more or less like a modern, or even postmodern, European nation-state, but forget that it missed out on three crucial ingredients of European modernization. First, as Yuri Senokosov has written, Russia never went through the Reformation or had its age of Enlightenment. This, Senokosov argues, is because “serfdom was abolished only in 1861 and the system of Russian autocracy collapsed only in 1917 […] It was then swiftly restored.” As a result, Russia never experienced the period of bourgeois civilization which, in Europe, established the outlines of the constitutional state.

Second, Russia was always an empire, never a nation-state. Autocracy is its natural form of rule. To its current czar, the disintegration of the Soviet Union in 1991 was a violation of Russian history.

The third missing ingredient, related to the absence of the first two, was liberal capitalism, of which Russia had only brief and limited experience. Marx insisted that the capitalist phase of economic development had to precede socialism, because any attempt to build an industrial economy on the archaic soil of peasant primitivism was bound to lead to despotism.

Yet, this is exactly what Lenin’s revolutionary formula of “Soviet power plus the electrification of the whole country” amounted to. Lenin, a brilliant opportunist, was following in the tradition of the great reforming czars who tried to westernize Russian society from the top. Peter the Great demanded that Russian men shave their beards and instructed his boyars: “Don’t gorge like a pig; don’t clean your teeth with a knife; don’t hold bread to your chest while cutting it.”

In the nineteenth century, Russia’s relationship with Europe took on a new dimension with the idea of the New Man – a Western type inextricably linked to Enlightenment philosophy and enthusiastic about science, positivism, and rationality. He appears as Stoltz in Ivan Goncharov’s 1859 novel Oblomov. In Ivan Turgenev’s Fathers and Sons (1862), he is the nihilist “son” Bazarov, who champions science and rails against his family’s irrational traditions. Nikolai Chernyshevsky’s novel What Is to Be Done? (1863), which strongly influenced Lenin, imagines a society of glass and steel built on scientific reason.

Because of their shallow roots in Russian culture, these futuristic projections incited a literary peasants’ revolt. Fyodor Dostoevsky’s Notes from the Underground, published in 1864, not only became one of the canonical texts of Christian Slavophilia, but also raised profound questions about modernity itself.

The Bolsheviks made the greatest collective attempt to bring the New Man out of literature and into the world. They, like Peter the Great, understood that transforming a society required transforming the people in it. They launched a concerted effort, with the participation of the foremost avant-garde artists of the time, to modernize people’s mindsets and nurture their revolutionary consciousness. Russians would become the scientifically and collectively minded New Men who would help build the Communist Utopia.

This was perhaps the biggest failure of all. With Stalin deeming socialism achieved in 1936, and state-mandated socialist realist literature and art exalting mysticism over science, Soviet dreams of a New Man remained just that. The retreat from science and logic survived the Soviet Union’s collapse and now is the animating tendency of Putin’s rule. His own faith-based mythology, unusual symbiotic relationship with the Orthodox Patriarch Kirill of Moscow, warping of history, and denial of facts, underscore the extent of Russia’s withdrawal from contemporary Europe.

In his 2003 book The Breaking of Nations, the former European Union diplomat Robert Cooper thought Russia’s future was still open. The signing of the Conventional Armed Forces in Europe Treaty and later Russian moves to join NATO indicated that “postmodern elements” were “trying to get out.” Whether the rapprochement was foiled by Western arrogance or Russian incompatibility will long be debated. By 2004, Putin had shed most of his liberalizing tendencies and began embracing traditionalism. In Cooper’s classification, Russia is now a modern state drifting back toward pre-modernity.

Following the Soviet Union’s 1968 invasion of Czechoslovakia, the Czech writer Milan Kundera refused to adapt Dostoevsky’s The Idiot for the stage. “Dostoevsky’s universe of overblown gestures, murky depths, and aggressive sentimentality repelled me,” Kundera said. It is in these murky depths, behind the rational façade, that we can glimpse Putin’s war.