House of Lords Speech on the Public Order Bill

My Lords, it is very cold in this House; I wonder what has happened to the heating. It certainly has a chilling effect on debate.

I am not a lawyer like the noble Lord, Lord Sandhurst, nor a policeman like the noble Lord, Lord Paddick. I am driven to take part in the debate because I have become increasingly concerned at the wide powers of surveillance and control being claimed by Governments in the name of public order and national security—powers that, in their structure though not yet in the scale of their implementation, resemble those in countries such as Russia and China.

I recall that George Orwell wrote in 1939 about

“whether the ordinary people in countries like England grasp the difference between democracy and despotism well enough to want to defend their liberties. One can’t tell until they see themselves menaced in some quite unmistakeable manner.”

People feel menaced in different ways; I myself have been woken up by one such menacing experience. I hope also to bring some historical perspective to the topic we are discussing.

The traditional aim of public order Acts, starting in 1936, was to prevent violent clashes on the streets. A famous common-law precedent was Wise v Dunning in 1902. Wise, a rabid anti-Papist, whose habit of speaking and dressing in a manner offensive to Catholics in Liverpool had led to fights at previous meetings, was bound over to keep the peace. The principle was clear enough: freedom of speech, procession and assembly must not be carried to the point where it caused violence on the streets.

As most noble Lords have pointed out, we already have plenty of Acts designed to prevent disruptive behaviour. Why do we need more? As the noble Lord, Lord Paddick, said, it is not because many of these measures have been demanded by the police. The noble Baroness, Lady Chakrabarti, suggested an answer that I find extremely convincing. This Bill brings peaceful, if inconvenient, protest and incitement to violence and terrorism into the same legal framework, implying in principle that the first is as culpable as the second. This argument is used to extend the powers of the state in dangerous ways, which have been charted only in despotic systems. That is why I talk about an Orwellian creep and cited George Orwell at the beginning.

I take up just two matters from Parts 2 and 3 of the Bill, consequential on this false identification between peaceful protest and violence and terrorism. The first, which other noble Lords have alluded to, is the extension of the police’s stop and search powers. In the past, stop and search powers have been used to prevent only the most serious offending, such as serious violence or reasonable suspicion of terrorism—for example, if people were suspected of carrying knives, guns or explosives. This was seriously open to racial discrimination and was highly controversial, but I can see a justification for the power itself. However, the Bill would extend the same powers of stop and search to the protest context. 

Someone can be stopped and searched for being suspected of being linked, however peripherally, to non-violent purposes or conduct. To stop and search someone suspected of carrying a bomb is one thing; to stop and search someone suspected of carrying a bicycle lock seems to me, to put it mildly, disproportionate—and, in fact, mad.

This leads me to my second point, to which I can hardly do justice in a short speech, namely the extremely worrying spread of arrest and detention where there is no reasonable suspicion that the person may be involved in proscribed behaviour, or where there is merely a balance of probabilities—I want to come back to that term—that they might be.

Clause 11 creates a new suspicionless stop and search power, whereby the police will have the power to specify that, in a particular locality and for a particular period of time, they do not need to have reasonable suspicion—in other words, an objective basis for suspicion based on evidence—that a protest-related offence will be committed, before stopping and searching people for a prohibited object. This is similar to powers contained in anti-terrorist legislation. Let me quote from the public information leaflet issued to explain Schedule 3 of the Counter-Terrorism and Border Security Act 2019:

“Unlike most police powers, the power to stop, question, search and, if necessary detain persons does not require any suspicion … The purpose is to determine whether a person appears to be, or to have been, engaged in Hostile … activity.”

Leave to one side the draconian powers being asserted here; it is surely fantastic to apply the same reasoning and powers to someone who might or might not be carrying a paintbrush.

Almost as bad as suspicionless stop and search is Clause 20, which authorises serious disruption prevention orders. Many noble Lords have talked about these. They allow a court to ban a person from attending demonstrations and protests for up to two years, not on conviction of any offence but on a balance of probabilities that, on at least two occasions in the previous five years, they have carried out activities related to a protest or caused or contributed to someone else doing so. Failure to comply with SDPO conditions is a criminal offence, punishable by up to 51 weeks' imprisonment.

The balance of probabilities means that the court must think that it is 51% likely that the person concerned has carried out such activities. If it thinks that it is only 49% likely, they get off free. What sort of evidence is needed to make that kind of calculation? I would be grateful if that could be explained. The essential point is that Clause 20 allows standards of proof appropriate in civil cases to be used for imposing criminal sanctions, such as electronic tagging, on individuals convicted of no criminal offence.

Any serious analyst of these measures would need to trace not only the growth of novel forms of protest, which is acknowledged, but the way that concepts such as dangerousness and mens rea—the guilty mind—have penetrated the heart of our criminal justice system, creating a large and growing area of law in which you do not have to have done anything criminal to be deprived of large chunks of your liberty.

It would be very difficult to amend the Bill to make it compliant with the European Convention on Human Rights. I therefore agree with those noble Lords who want to reject Parts 2 and 3 and seriously amend Part 1.

Gorbachev’s Tragic Legacy

Oct 19, 2022 ROBERT SKIDELSKY

Admired in the West but loathed by his countrymen as a harbinger of Russia’s post-Cold War misfortune, Mikhail Gorbachev fully grasped the immense challenges of reforming the ailing Soviet Union. Today’s Russia largely reflects the anti-Western grievances stemming from his failure.

LONDON – Mikhail Gorbachev, the Soviet Union’s last leader, was buried last month at the Novodevichy Cemetery in Moscow next to his wife Raisa and near fellow Soviet leader Nikita Khrushchev. To no one’s surprise, Russian President Vladimir Putin did not attend the funeral. Novodevichy, after all, is where “unsuccessful” Soviet leaders had been consigned to their final rest.

Putin’s snub reminded me of a conversation I had two decades ago during a midnight stroll through Red Square. On impulse, I asked the army officer stationed in front of Lenin’s Tomb who was buried in the Soviet Necropolis behind it, and he offered to show me. There, I saw a succession of graves and plinths for Soviet leaders: from Stalin to Leonid Brezhnev, Alexei Kosygin, and Yuri Andropov. The last plinth was unoccupied. “For Gorbachev, I suppose?” I asked. “No, his place is in Washington,” the officer replied. 

Ironically, Gorbachev has been lionized in the West for accomplishing something he never set out to do: bringing about the end of the Soviet Union. He was awarded the Nobel Peace Prize in 1990, yet Russians widely regarded him as a traitor. In his ill-fated attempt at a political comeback in the 1996 Russian presidential election, he received just 0.5% of the popular vote.

Gorbachev remains a reviled figure in Russia. A 2012 survey by the state-owned pollster VTsIOM found that Gorbachev was the most unpopular of all Russian leaders. According to a 2021 poll, more than 70% of Russians believed their country had moved in the wrong direction under his rule. Hardliners hate him for dismantling Soviet power, and liberals despise him for clinging to the impossible ideal of reforming the communist regime.

I became acquainted with Gorbachev in the early 2000s when I attended meetings of the World Political Forum, the think tank he founded in Turin. The organization was purportedly established to promote democracy and human rights. But in practice, its events were nostalgic reminiscences where Gorbachev held forth on “what might have been.” He was usually flanked by other has-beens of his era, including former Polish leaders Wojciech Jaruzelski and Lech Wałęsa, former Hungarian Prime Minister Gyula Horn, Russian diplomat Alexander Bessmertnykh, and a sprinkling of left-leaning academics. 

Gorbachev’s idea of a “Third Way” between socialism and capitalism was briefly fashionable in the West but was soon swamped by neoliberal triumphalism. Nonetheless, I liked and respected this strangely visionary leader of the dying USSR, who refused to use force to resist change.

Today, most Russians cast Gorbachev and Boris Yeltsin as harbingers of Russia’s misfortune. Putin, on the other hand, is widely hailed as a paragon of order and prosperity who has reclaimed Russia’s leading role on the world stage. In September, 60% of Russians said they believe that their country is heading in the right direction, though this no doubt partly reflects the tight control Putin exercises over television news (the main source of information for most citizens). 

In the eyes of most Russians, Gorbachev’s legacy is one of naivete and incompetence, if not outright betrayal. According to the prevailing narrative, Gorbachev allowed NATO to expand into East Germany in 1990 on the basis of a verbal commitment by then-US Secretary of State James Baker that the alliance would expand “not one inch eastwards.” Gorbachev, in this telling, relinquished Soviet control over Central and Eastern Europe without demanding a written assurance.

In reality, however, Baker was in no position to make such a promise in writing, and Gorbachev knew it. Moreover, Gorbachev repeatedly confirmed over many years that a serious promise not to expand NATO eastward was never made.

In any case, the truth is that the Soviet Union’s control over its European satellites had become untenable after it signed the 1975 Helsinki Final Act. The accord, signed by the United States, Canada, and most of Europe, included commitments to respect human rights, including freedom of information and movement. Communist governments’ ability to control their population gradually eroded, culminating in the chain of mostly peaceful uprisings that ultimately led to the Soviet Union’s dissolution. 

Yet there is a grain of truth to the myth of Gorbachev’s capitulation. After all, the USSR had not been defeated in battle, as Germany and Japan were in 1945, and the formidable Soviet military machine remained intact in 1990. In theory, Gorbachev could have used tanks to deal with the popular uprisings in Eastern Europe, as his predecessors did in East Germany in 1953, Hungary in 1956, and Czechoslovakia in 1968. 

Gorbachev’s refusal to resort to violence to preserve the Soviet empire resulted in a bloodless defeat and a feeling of humiliation among Russians. This sense of grievance has fueled widespread distrust of NATO, which Putin used years later to mobilize popular support for his invasion of Ukraine. 

Another common misconception is that Gorbachev dismantled a functioning economic system. In fact, far from fulfilling Khrushchev’s promise to “bury” the West economically, the Soviet economy had been declining for decades. 

Gorbachev understood that the Soviet Union could not keep up with the US militarily while satisfying civilian demands for higher living standards. But while he rejected the Brezhnev era’s stagnation-inducing policies, he had nothing coherent to put in their place. Instead of facilitating a functioning market economy, his rushed abandonment of the central-planning system enriched the corrupt managerial class in the Soviet republics and led to a resurgence of ethnic nationalism.

To my mind, Gorbachev is a tragic figure. While he fully grasped the immense challenges facing Soviet communism, he had no control over the forces he helped unleash. Russia in the 1980s simply lacked the intellectual, spiritual, and political resources to overcome its underlying problems. But while the Soviet empire has been extinct for 30 years, many of the dysfunctions that contributed to its demise now threaten to engulf the entire world.

Economy: The Growth Plan 2022

House of Lords Speech: 10 October 2022

My Lords, the twin problems to which the mini-Budget was addressed were near-zero growth and a relentless rise in prices. I doubt whether it will do very much for the first—certainly not in time to offset the second. In the short run, what we face is not a growth crisis but an inflationary crisis and that, of course, also means a currency crisis.

What was the growth strategy? I think it was based on Reaganomics—the idea that unfunded tax cuts, by incentivising the wealthy to work hard and invest more, would pay for themselves. That was a sort of Laffer curve idea, which was very popular in the 1980s. The British Treasury never bought it; it always thought that tax cuts to encourage the wealthy would need to be complemented by welfare cuts to incentivise the poor to “get on their bikes”, in the famous phrase.

Well, that is the basis of growth orthodoxy but it is very insecure. There is no evidence that tax cuts for the rich speed up the real rate of economic growth. What they do encourage is speculation in financial assets and real estate. Also, there is no correlation between the rate of growth and the size of the public sector. So the growth strategy is very insecurely based. As the noble Lord, Lord Eatwell, pointed out earlier, public investment, not public ownership, has been the main growth engine since the war.

Apart from its intellectual incoherence, the Chancellor’s mini-Budget sets out to tackle the wrong problem. The problem, as Keynes wrote in 1939, is not how to get growth but

“how to pay for the war”.

We have blundered inadvertently into a war situation, and that creates war problems. A war economy is inherently inflationary: too much consumer demand, too little supply. This is our situation. Excess demand is easy enough to explain. For over a year, from 2020 to 2021, the Government paid a large chunk of the workforce to not work. Pent-up demand exploded before supply could catch up.

That is one part of it but, in addition, Russia’s invasion of Ukraine has produced big supply shortages, reflected in the near doubling of wholesale energy prices. They would have trebled had it not been for the energy price cap. How long can the Government go on capping prices without raising taxes? Is the Minister expecting energy supply in Europe to increase over the next few years? Is he expecting Saudi Arabia to increase rather than reduce supply? The important point in these questions is that, when debt costs are rising, the Government should be reducing and not increasing their borrowing.

Today, we are significantly less able to run a war economy than we were in 1940. We make fewer things, grow less food and are more dependent on foreign supplies. Extensive deindustrialisation since the 1980s has made our standard of living dependent on the City of London’s ability to finance our twin deficits—budget and current account—which are rising towards 10% of GDP. The City attracts capital into London to engage in financial investment. The energy crisis has blown a hole in the current account. Banks have indicated that a 10% current account deficit will be very difficult for the City to finance. The second factor depressing sterling is, of course, the very high rate of inflation.

We need a credible currency to maintain our standard of living and there is nothing in the strategy of the mini-Budget that guarantees that. What we need is a co-ordinated policy that can communicate a clear path forward.

Requiem for an Empire

Sep 12, 2022 ROBERT SKIDELSKY

Since World War II, Britain’s influence in the world has relied on its “special relationship” with the United States, its position as head of the Commonwealth (the British Empire’s successor), and its position in Europe. The Americans are still there, but Europe isn’t, and now the head of the Commonwealth isn’t, either.

LONDON – Amid the many, and deserved, tributes to Queen Elizabeth II, one aspect of her 70-year reign remained in the background: her role as monarch of 15 realms, including Australia, New Zealand, and Canada. She was also the head of the Commonwealth, a grouping of 56 countries, mainly republics.

This community of independent states, nearly all of them former territories of the British Empire, has been crucial in conserving a “British connection” around the world in the post-imperial age. Whether this link is simply a historical reminiscence, whether it stands for something substantial in world affairs, and whether and for how long it can survive the Queen’s passing, have become matters of great interest, especially in light of Britain’s withdrawal from the European Union.

In the nineteenth-century era of Pax Britannica, Britain exercised global power on its own. The sun never set on the British Empire: the British navy ruled the waves, British finance dominated world markets, and Britain maintained the European balance of power. This era of “splendid isolation” – never as splendid or isolated as history textbooks used to suggest – ended with World War I, which gravely wounded Britain’s status as a world power and correspondingly strengthened other claimants to that role.

As the results of WWI were confirmed by World War II, British foreign policy came to center on the doctrine of the “three circles.” Britain’s influence in the world would rely on its “special relationship” with the United States, its position as head of the Commonwealth (the empire’s successor), and its position in Europe. By its membership of these overlapping and mutually reinforcing circles, Britain might hope to maximize its hard and soft power and mitigate the effects of its military and economic “dwarfing.”

Different British governments attached different weights to the three roles in which Britain was cast. The most continuously important was the relationship with the US, which dates from WWII, when the Americans underwrote Britain’s military and economic survival. The lesson was never forgotten. Britain would be the faithful partner of the US in all its global enterprises; in return, Britain could draw on an American surplus of goodwill possessed by no other foreign country. For all the pragmatic sense it made, one cannot conceive of such a connection being forged, or enduring, without a common language and a shared imperial history.

Imperial history was also central to the second circle. The British Empire of 1914 became the British Commonwealth in 1931, and finally just The Commonwealth, with the Queen as its titular head. Its influence lay in its global reach. Following the contours of the British Empire, it was the only world organization (apart from the United Nations and its agencies) which spanned every continent.

The Commonwealth conserved the British connection in two main ways. First, it functioned as an economic bloc through the imperial preference system of 1932 and the sterling area that was formalized in 1939, both of which survived into the 1970s. Second, and possibly more durably, its explicitly multiracial character, so ardently supported by the Queen, served to soften both global tensions arising from ethnic nationalism, and ethnic chauvinism in the “mother country.” Multicultural Britain is a logical expression of the old multicultural empire.

The European link was the weakest and was the first to snap. This was because Britain’s historic role in Europe was negative: to prevent things from happening there which might endanger its military security and economic livelihood. To this end, it opposed all attempts to create a continental power capable of bridging the Channel. Europe was just 20 miles away, and British policy needed to be ever watchful that nasty things did not happen “over there.”

John Maynard Keynes expressed this permanent sense of British estrangement from the Continent. “England still stands outside Europe,” he wrote in 1919. “Europe’s voiceless tremors do not reach her: Europe is apart and England is not of her flesh and body.” The Labour leader, Hugh Gaitskell, famously evoked this sense of separation when he played the Commonwealth card in 1962, urging his party not to abandon “a thousand years of history” by joining the European Economic Community.

Britain’s policy towards Europe has always been to prevent the emergence of a Third Force independent of US-led NATO. Charles de Gaulle saw this clearly, vetoing Britain’s first application to join the EEC in 1963 in order to prevent an American “Trojan Horse” in Europe.

Although Prime Minister Tony Blair wanted Britain to be at “the heart” of Europe, Britain pursued the same game inside the EEC and then the EU from 1973 until 2020. The only really European-minded prime minister in this period was Edward Heath. Otherwise, British governments have sought to maximize the benefits to Britain of trade and tourism, while minimizing the dangers of political contamination. Today, it is not surprising that Britain joins the US to project NATO power in Eastern Europe over the stricken torso of the EU itself.

So, Britain is left with just two circles. In the wake of Brexit, the Queen’s legacy is clear. Through her official position and personal qualities, she preserved the Commonwealth as a possible vehicle for projecting what remains of Britain’s hard power, such as military alliances in the South Pacific. And whatever one may think of Britain’s hard power, its soft power – reflecting its trading relationships, its cultural prestige in Asia and Africa, and its multicultural ideal – is a global public good in an age of growing ethnic, religious, and geopolitical conflict.

I doubt whether the two remaining circles can compensate for Britain’s absence from the third. The question that remains to be answered is how much the Commonwealth’s durability depended on the sheer longevity of the late monarch, and how much of it can be preserved by her successor.

Mind the Policy Gaps

Aug 22, 2022 ROBERT SKIDELSKY

The widening gaps in policy formation nowadays reflect the division of labor and increasing specialization that has taken us from the sixteenth-century ideal of the Renaissance man. And today’s biggest policymaking gap has grown so large that it threatens global catastrophe.

LONDON – Just as the insistent demand for more “transparency” is a sure sign of increasing opacity, the current clamor for “joined-up thinking” indicates that the need for it far outstrips the supply. With its recent report on energy security, the UK House of Lords Economic Affairs Committee has added its voice to the chorus.

The report’s language is restrained, but its message is clear: Without a “joined-up” energy policy, the United Kingdom’s transition to net zero by 2050 will be “disorderly” (read: “will not happen”). For example, the policy aimed at improving home insulation is at odds with local authorities’ listed-building regulations.

In April, the government called on the Bank of England and financial regulators to “have regard to” energy security. What does this mean? Which institution is responsible for which bits of energy security? How does energy security relate to the net-zero goal? Never mind gaps in the data: The real problem is yawning chasms in thinking.

In a masterly understatement, the committee’s report says that “The Russian invasion of Ukraine has created global energy supply issues.” In fact, economic sanctions against the invader have contributed significantly to a massive energy and food crisis that threatens the sanctioning countries with stagflation and many people in developing economies with starvation.

China provides key minerals for renewable-energy technologies, including wind turbines and solar cells, and supplies 66% of finished lithium-ion batteries. The report concludes that the UK must “not become reliant on strategic competitors, notably China, for critical minerals and components.” Moreover, the government “will need to ensure that its foreign and trade policies […] and its policy on net zero are aligned.” Quite a bit more joining up to do, then.

The widening gaps in policy formation reflect the increasing division of labor resulting from the relentless march of complexity. Today’s policymakers and their advisers know more and more about less and less, recalling Adam Smith’s description in The Wealth of Nations of a pin factory worker:

“The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same, has no occasion to exert his understanding, or to exercise his invention in finding out expedients for removing difficulties which never occur. He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become.”

No one in Smith’s factory would need to know how to make a pin, or even what the purpose of producing one was. They would know only how to make a part of a pin. Likewise, the world is becoming full of “experts” who know only little bits of their subject.

The ideal of the “Renaissance man,” who could do a lot of joined-up thinking, did not survive the growing division of labor. By the eighteenth century, knowledge was being split up into “disciplines.” Now, subdisciplines sprout uncontrollably, and communication of their findings to the public is left to journalists who know almost nothing about everything.

Today’s biggest gap, one so large that it threatens catastrophe, is between geopolitics and economics. Foreign ministries and Treasuries no longer talk to each other. They inhabit different worlds, use different conceptual languages, and think about different problems.

The geopolitical world is divided up into “strategic partners” and “strategic rivals.” Borders are alive and well. States have conflicting national interests and pursue national-security policies. Economics, in contrast, is the science of a single market: Its ideal is economic integration across frontiers and a global price mechanism that automatically harmonizes conflicting preferences. Economics also tells us that commerce softens the asperities of politics, creating over time a single community of learning and culture.

The eighteenth-century English historian Edward Gibbon described history as “little more than the register of the crimes, follies, and misfortunes of mankind.” But the end of the Cold War led to hopes that the world was at last growing up; in 1989, the American academic Francis Fukuyama proclaimed the “end of history.”

Today, however, geopolitics is back in the saddle. The West regards Russia as a pariah because of its invasion of Ukraine; China, if not yet quite meriting that status, is a strategic rival with which trade in many areas should be shunned. At the same time, Western governments also insist on the need for global cooperation to tackle potentially lethal climate change and other existential dangers like nuclear proliferation and pandemics.

In 2012, for example, the University of Cambridge established the Centre for the Study of Existential Risk to help mitigate threats that could lead to humanity’s extinction. Decades earlier, in 1945, atomic scientists from the Manhattan Project founded the Bulletin of the Atomic Scientists to warn humanity of the possible negative consequences arising from accelerating technological advances. They started the Doomsday Clock, the time of which is announced each January. In 1947, the clock was set at seven minutes to midnight. Since January 2020, it has stood at 100 seconds to midnight, marking the closest humanity has come to Armageddon in 75 years.

The danger of nuclear proliferation, which prompted the clock’s creation, is still very much with us. As the bulletin points out, “Development of hypersonic glide vehicles, ballistic missile defenses, and weapons-delivery systems that can flexibly use conventional or nuclear warheads may raise the probability of miscalculation in times of tension.” Do those who urge the expulsion of Russian forces from Ukraine, by economic or military means, consider this risk? Or is this another “gap” in their knowledge of the world?

It is a tragedy that the economic basis of today’s world order, such as it is, is now being put at risk. That risk is the result of actions for which all the great powers share responsibility. The ironically named United Nations, which exists to achieve planetary security, is completely marginalized. Its virtual absence from the big scenes of international conflict is the biggest gap of all, and one which jeopardizes our common future.

Boris Johnson’s Fall – and Ours

Jul 19, 2022 ROBERT SKIDELSKY

Although words like “unprincipled,” “amoral,” and “serial liar” seem to describe the outgoing British prime minister accurately, they describe more successful political leaders as well. To explain Johnson’s fall, we need to consider two factors specific to our times.

LONDON – Nearly all political careers end in failure, but Boris Johnson is the first British prime minister to be toppled for scandalous behavior. That should worry us.

The three most notable downfalls of twentieth-century British leaders were caused by political factors. Neville Chamberlain was undone by his failed appeasement policy. The Suez fiasco forced Anthony Eden to resign in 1957. And Margaret Thatcher fell in 1990 because popular resistance to the poll tax persuaded Tory MPs that they could not win again with her as leader. 

True, Harold Macmillan was undone in 1963 by the Profumo sex scandal, but this involved a secretary of state for war and possible breaches of national security. Election defeats following economic failure brought down Edward Heath and James Callaghan in the 1970s. Tony Blair was forced to resign by the Iraq debacle and Gordon Brown’s impatience to succeed him. David Cameron was skewered by Brexit, and Theresa May by her failure to deliver Brexit. 

No such events explain Johnson’s fall. 

David Lloyd George, a much greater leader than Johnson, is his only serious rival in sleaze. But though the sale of seats in the House of Lords, slipshod administrative methods, and dishonesty had weakened Lloyd George, the immediate cause of his fall (exactly a century ago) was his mishandling of the Chanak crisis, which brought Britain and Turkey to the brink of war. 

The more familiar comparison is with US President Richard Nixon. Every Johnson misdemeanor is routinely labeled with the suffix “-gate,” after the Watergate break-in that ended Nixon’s presidency.

John Maynard Keynes called Lloyd George a “crook”; Nixon famously denied that he was one. Neither they nor Johnson were crooks in the technical sense (of being convicted of crimes), but Nixon would have been impeached in 1974 had he not resigned, and Johnson was fined £50 for breaking lockdown rules. Moreover, all three showed contempt for the laws they were elected to uphold, and for the norms of conduct expected from public officials. 

We struggle to describe their character flaws: “unprincipled,” “amoral,” and “serial liar” seem to capture Johnson. But they describe more successful political leaders as well. To explain his fall, we need to consider two factors specific to our times. 

The first is that we no longer distinguish personal qualities from political qualities. Nowadays, the personal really is political: personal failings are ipso facto political failings. Gone is the distinction between the private and the public, between subjective feeling and objective reality, and between moral and religious matters and those that government must address. 

Politics has crossed into the realm previously occupied by psychiatry. This was bound to happen once affluence undermined the old class basis of politics. Questions of personal identity arising from race, gender, sexual preference, and so on now dominate the spaces vacated by the politics of distribution. Redressing discrimination, not addressing inequality, became the task of politics. 

Johnson is both a creature and a victim of identity politics. His rhetoric was about “leveling up” and “our National Health Service.” But, in practice, he made his personality the content of his politics. No previous British leaders would have squandered their moral capital on trivial misdemeanors and attempted cover-ups, because they knew that it had to be kept in reserve for momentous events. But momentous events are now about oneself, so when a personality is seen as flawed, there is no other story to tell. 

Johnson’s personality-as-politics was also the creation of the media. In the past, newspapers, by and large, reported the news; now, focusing on personalities, they create it. This change has given rise to a corrupt relationship: personalities use the media to promote themselves, and the media expose their frailties to sell copy. 

There has always been a large market for sexual and financial gossip. But even in the old “yellow press,” there was a recognized sphere of public events that took priority. Now the gossip stories are the public events. 

This development has radically transformed public perceptions about the qualities a political leader should have. Previous generations of political leaders were by no means all prudes. They lied, drank, fornicated, and took bribes. But everyone concerned with politics recognized that it was important to protect the public sphere. Leaders’ moral failings were largely shielded from scrutiny, unless they became egregious. And even when the public became aware of them, they were forgiven, provided the leaders delivered the goods politically. 

Most of the offenses that led to Johnson’s resignation would never have been reported in the past. But today the doctrine of personal accountability justifies stripping political leaders naked. Every peccadillo, every lapse from correct expression, becomes a credibility-destroying “disgrace” or “shame.” People’s ability to operate in the public sphere depends on privacy. Once privacy is gone, so is their ability to act effectively when it matters. 

The other new factor is that politics is no longer viewed as a vocation so much as a stepping stone to money. Media obsession with what a political career is worth, rather than whether politicians are worthy of their jobs, is bound to affect what politically ambitious people expect to achieve and the public’s view of what to expect from them. Blair is reported to have amassed millions in speaking engagements and consultancies since leaving office. In keeping with the times, The Times has estimated how much money Johnson could earn from speaking fees and book deals, and how much more he is worth than May. 

In his resignation speech, Johnson sought to defend the “best job in the world” in traditional terms, while criticizing the “eccentricity” of being removed in mid-delivery of his promises. But this defense of his premiership sounded insincere, because his career was not a testimony to his words. The cause of his fall was not just his perceived lack of morality, but also his perceived lack of a political compass. For Johnson, the personal simply exposed the hollowness of the political.

Russia’s Path to Premodernity

Jun 14, 2022 ROBERT SKIDELSKY

The Stalinist retreat from science and logic persisted following the Soviet Union’s collapse and is now the main tendency of Russian President Vladimir Putin’s rule. With his faith-based mythology, warping of history, and denial of facts, Putin’s withdrawal from contemporary Europe could not be starker.

LONDON – The Russian writer Pyotr Chaadayev said of his country that “we have never advanced along with other people; we are not related to any of the great human families; we belong neither to the West nor to the East, and we possess the traditions of neither. Placed, as it were, outside of the times,” he wrote, “we have not been affected by the universal education of mankind.”

That was in 1829. The “riddle, wrapped in a mystery, inside an enigma,” as Winston Churchill described Russia more than a century later, is no closer to being solved today. The philosopher John Gray recently wrote that Russian President Vladimir Putin “is the face of a world the contemporary Western mind does not comprehend. In this world, war remains a permanent part of human experience; lethal struggles over territory and resources can erupt at any time; human beings kill and die for the sake of mystical visions.” That is why Western commentators and liberal Russians are baffled by Putin’s so-called “special military operation” in Ukraine.

Personality-based explanations for Putin’s actions are the easiest to advance – and the most facile. Putin is neither acting like an expert chess player, calculating every move, nor like a ruler unhinged by power or steroids.

Rather, Putin has a distorted, or at least one-sided, view of Russian history, and of what constitutes Russia’s special virtue. But this does not explain the widespread popular and intellectual support in Russia for his justificatory narrative regarding Ukraine. We are all to some extent captives of our national myths. It is just that Russian mythology is out of step with “the universal education of mankind.”

We expect Russia to behave more or less like a modern, or even postmodern, European nation-state, but forget that it missed out on three crucial ingredients of European modernization. First, as Yuri Senokosov has written, Russia never went through the Reformation or had its age of Enlightenment. This, Senokosov argues, is because “serfdom was abolished only in 1861 and the system of Russian autocracy collapsed only in 1917 […] It was then swiftly restored.” As a result, Russia never experienced the period of bourgeois civilization which, in Europe, established the outlines of the constitutional state.

Second, Russia was always an empire, never a nation-state. Autocracy is its natural form of rule. To its current czar, the disintegration of the Soviet Union in 1991 was a violation of Russian history.

The third missing ingredient, related to the absence of the first two, was liberal capitalism, of which Russia had only brief and limited experience. Marx insisted that the capitalist phase of economic development had to precede socialism, because any attempt to build an industrial economy on the archaic soil of peasant primitivism was bound to lead to despotism.

Yet, this is exactly what Lenin’s revolutionary formula of “Soviet power plus the electrification of the whole country” amounted to. Lenin, a brilliant opportunist, was following in the tradition of the great reforming czars who tried to westernize Russian society from the top. Peter the Great demanded that Russian men shave their beards and instructed his boyars: “Don’t gorge like a pig; don’t clean your teeth with a knife; don’t hold bread to your chest while cutting it.”

In the nineteenth century, Russia’s relationship with Europe took on a new dimension with the idea of the New Man – a Western type inextricably linked to Enlightenment philosophy and enthusiastic about science, positivism, and rationality. He appears as Stoltz in Ivan Goncharov’s 1859 novel Oblomov. In Ivan Turgenev’s Fathers and Sons (1862), he is the nihilist “son” Bazarov, who champions science and rails against his family’s irrational traditions. Nikolai Chernyshevsky’s novel What Is to Be Done? (1863), which strongly influenced Lenin, imagines a society of glass and steel built on scientific reason.

Because of their shallow roots in Russian culture, these futuristic projections incited a literary peasants’ revolt. Fyodor Dostoevsky’s Notes from the Underground, published in 1864, not only became one of the canonical texts of Christian Slavophilia, but also raised profound questions about modernity itself.

The Bolsheviks made the greatest collective attempt to bring the New Man out of literature and into the world. They, like Peter the Great, understood that transforming a society required transforming the people in it. They launched a concerted effort, with the participation of the foremost avant-garde artists of the time, to modernize people’s mindsets and nurture their revolutionary consciousness. Russians would become the scientifically and collectively minded New Men who would help build the Communist Utopia.

This was perhaps the biggest failure of all. With Stalin deeming socialism achieved in 1936, and state-mandated socialist realist literature and art exalting mysticism over science, Soviet dreams of a New Man remained just that. The retreat from science and logic survived the Soviet Union’s collapse and now is the animating tendency of Putin’s rule. His own faith-based mythology, unusual symbiotic relationship with the Orthodox Patriarch Kirill of Moscow, warping of history, and denial of facts, underscore the extent of Russia’s withdrawal from contemporary Europe.

In his 2003 book The Breaking of Nations, the former European Union diplomat Robert Cooper thought Russia’s future was still open. The signing of the Conventional Armed Forces in Europe Treaty and later Russian moves to join NATO indicated that “postmodern elements” were “trying to get out.” Whether the rapprochement was foiled by Western arrogance or Russian incompatibility will long be debated. By 2004, Putin had shed most of his liberalizing tendencies and began embracing traditionalism. In Cooper’s classification, Russia is a modern pre-modern state.

Following the Soviet Union’s 1968 invasion of Czechoslovakia, the Czech writer Milan Kundera refused to adapt Dostoevsky’s The Idiot for the stage. “Dostoevsky’s universe of overblown gestures, murky depths, and aggressive sentimentality repelled me,” Kundera said. It is in these murky depths, behind the rational façade, that we can glimpse Putin’s war.

The Guardian view on a four-day week: policies needed to make it a reality

After the first world war, workers wanted a peace dividend for their sacrifices. Within three years they got it. Almost every industrialised nation – with the exception of Japan – accepted the newly established International Labour Organization’s call to limit working hours to eight a day and 48 a week. While most developed countries enacted legislation to achieve these aims, Britain, along with the United States and Italy, did so through collective agreements.

Today, the triple crises of Covid, Russia’s war in Ukraine and Brexit are creating job-altering shocks. Employers are already implementing remote working. Some workers, perhaps those with comfortable homes, prefer online messaging to water cooler chats and web conference calls to in-person ones. Others, meanwhile, are opting out of work altogether. From Monday, thousands of workers in 70 UK companies will be paid the same wages for a four-day working week as for a five-day one. As with the eight-hour day in 1919, workers are demanding changes once regarded as fringe, eccentric ideas.

There are good arguments for a four-day week. Studies suggest improvements in workers’ happiness and in productivity. The UK has for too long fostered a working culture that encourages long hours and employee exhaustion. About 10 million people – almost one in three people in work – would work fewer hours if they could. Remarkably, 3 million of them would take fewer hours even at a loss in pay.

Having to work less hard for a desired income is obviously welcome. But such a desirable outcome is complicated by factors such as the pressure to consume, security of employment and inequalities of power and income. Given the prevalence of in-work poverty, and with inflation hitting those on lowest incomes the hardest, many British workers cannot afford to cut their hours.

In 2019, the economist Lord Skidelsky considered the problem in detail for the Labour party. His report compared how European countries had managed to make employment more compatible with wellbeing. He noted, with approval, how collective bargaining in Germany had seen workers receive real wage increases and reductions in working hours in return for improved productivity. He rejected a French-style legislated national limit, noting that it broke down within a few years.

The peer’s insight was that the economic security and rights of UK workers had to be improved so that they were in “a position to decrease their working hours voluntarily should they wish to”. In the modern age, it is clear that the market cannot provide continuous full employment. That is why Lord Skidelsky advocated for a new role for the government as an “employer of last resort”, by guaranteeing jobs paying the living wage to the unemployed who cannot find work in the private sector.

By providing an alternative to the market, argued the peer, the state would gain a powerful lever to push down the average number of hours worked. Lord Skidelsky thought that a 35-hour working week in the public sector over 10 years was achievable with the right policies. Britain’s experience a century ago is worth recalling. The loss in output from cutting working time was largely offset by increased hourly productivity. The shorter day led to the growth of leisure and consumer industries. Currently, the financial logic that governs the rules of employment is inimical to reducing workloads. What is needed are countervailing institutions to push society in the direction that technology makes possible and that most people desire.

The Case for Nordic and NATO Realism

To be a realist in international relations is to accept that some states are more sovereign than others. “Strict realism” now requires that Sweden and Finland pause before rushing into NATO’s arms, and that the Alliance take a step back before accepting them.

LONDON – Finland and Sweden have announced that they will apply for NATO membership. But joining the Alliance is more likely to weaken than enhance their security and that of Europe.

Strategic neutrality has preserved Sweden’s independence and freedom from war for 200 years, and Finland’s independence since 1948. Has anything happened to justify ending it?

Swedish and Finnish officials point to two episodes. In December 2021, the Kremlin went from desiring Swedish and Finnish neutrality to, in essence, demanding it, sending a clear and threatening message that an independent foreign policy is a privilege, not a right, for Russia’s neighbors. More important, Russia’s invasion of Ukraine has fundamentally worsened the two countries’ security environment by increasing the risk that Russia will attack or seek to intimidate them. Since they cannot hope to defeat Russia in battle, singly or jointly, they must join an organization that can.

In expert-speak, NATO membership will “raise the threshold of deterrence.” Faced with the certainty of retaliation (including nuclear, if necessary), Russia will desist from attacking, or seriously bullying, Sweden and Finland. This argument strongly implies that, had Ukraine been a NATO member, Russia would not have invaded it, since, as the Swedish foreign and defense ministries point out, “Russia (or the Soviet Union) has never attacked a NATO ally.” But Sweden and Finland’s efforts to strengthen deterrence might be self-defeating, because NATO enlargement could lower Russia’s threshold for attacking them, at least before they become Alliance members.

Judging the wisdom of further NATO enlargement requires taking a view on two matters. First, is Russia’s invasion of Ukraine (however unjustified in law and brutal in execution) evidence of a general expansionary intent, or is it sui generis? Second, what responsibilities for maintaining peace fall on small countries that abut big countries?

History offers some guidance on both questions. After 1945, Stalin could have absorbed Finland into the Soviet Union, or ruled it through a puppet. Finland had been crushed in a war in which it fought on the side of the Germans – something Finns don’t like to be reminded of, though their alliance with Hitler came about only following Stalin’s 1939 invasion.

Still, Stalin was never interested in restoring Czarist rule over Finland. His concern was strategic. As Stalin said in 1940 following the Soviet Union’s “Winter War” with Finland, “we can’t move Leningrad, [so] we must move the borders.” What he demanded, and eventually got, was some 10% of Finnish territory, including a big slice of Karelia near Leningrad (now St. Petersburg), plus some strategic islands.

After this land grab, Stalin guaranteed Finnish independence in the 1948 Agreement of Friendship, Cooperation, and Mutual Assistance, on condition that Finland promised to “fight to repel” any attack on the Soviet Union “through Finnish territory,” with help from the Kremlin if Finland agreed. Unlike the Soviet Union’s Eastern European satellite states, Finland was not required to join the Warsaw Pact when it was established in 1955.

There is a superficial parallel between Ukraine’s current tragedy and Finland circa 1939-48. Stalin made Finnish neutrality a condition of its independence, while Russian President Vladimir Putin claims that his main demand is that Ukraine renounce the goal of NATO membership.

But the differences between the two cases are greater. Although part of the Czarist empire, Finland was never part of “historic” Russia as Ukraine was, and contained no large Russian minorities. Putin regards Ukraine as an “inalienable” part of Russia, and blames Lenin’s establishment of a Ukrainian Soviet Socialist Republic for creating Ukrainian nationalism. So, while strategic considerations may have been uppermost in Stalin’s mind, it is reasonable to suppose – as Ukrainians and Ukraine’s Western supporters do – that Putin is using the threat of NATO expansion as an excuse to undo what he sees as Lenin’s historic mistake.

If Russia’s fear of NATO is genuine, Sweden and Finland’s membership applications will expose them to the risk of retaliation before they join, and it is at least debatable as to whether a NATO Article 5 guarantee will offer greater real security than neutrality does. If the Russia-Ukraine war is specific to Russian history, with NATO expansion only an excuse, it cannot be seen as a prelude to unlimited territorial expansion, though Putin’s remarks belittling Kazakhstan’s statehood are worryingly similar to his denials of Ukraine’s right to exist. Either way, the case for Swedish and Finnish NATO membership is not open and shut.

This brings us to the second matter, small countries’ responsibilities for peace. The former European Union diplomat Robert Cooper argues in his book The Ambassadors that “strict realism [is] required by small states with big neighbors.” And it is realism that seems to be lacking in the Swedish and Finnish governments’ current policy thinking. Consider the Swedish foreign and defense ministries’ assertion that “The Russian leadership operates based on … a view of history that differ[s] from th[at] of the West,” including “the aim of creating spheres of influence.”

Attributing that Russian conception simply to totalitarian thinking amounts to a denial of any special obligation of a state to its people arising from its location in the international system – the reverse of Cooper’s “strict realism.” The doctrine of spheres of influence may be alien to today’s international norms, but not to international practice. No powerful state wants a potential enemy on its doorstep. This was (and remains) the basis of the US Monroe Doctrine vis-à-vis the Western Hemisphere. It is supposedly the basis of Russia’s strategic doctrine, though in practice Russia has preferred to have vassal states on its borders.

To be a realist in international relations is to accept that some states are more sovereign than others. The Finns acknowledged this after World War II. “Strict realism” now requires that Sweden and Finland pause before rushing into NATO’s arms, and that the Alliance take a step back before accepting them. Ukraine, whose brave resistance has set the limits on Russia’s territorial expansion, also must now be willing to negotiate some form of peaceful coexistence with its more powerful neighbor.