The British Empire comprised the dominions, colonies, protectorates, mandates, and other territories ruled or administered by the United Kingdom and its predecessor states. It originated with the overseas possessions and trading posts established by England between the late 16th and early 18th centuries.
It grew into a worldwide system of dependencies—colonies, protectorates, and other territories—that over a span of some three centuries was brought under the sovereignty of the crown of Great Britain and the administration of the British government.
The policy of granting or recognizing significant degrees of self-government by dependencies, which was favoured by the far-flung nature of the empire, led to the development by the 20th century of the notion of a “British Commonwealth,” comprising largely self-governing dependencies that acknowledged an increasingly symbolic
British sovereignty. The term was embodied in statute in 1931. Today the Commonwealth includes former elements of the British Empire in a free association of sovereign states.
Origins of the British Empire
Maritime expansion, driven by commercial ambitions and by competition with France, accelerated in the 17th century and resulted in the establishment of settlements in North America and the West Indies.
By 1670 there were British American colonies in New England, Virginia, and Maryland and settlements in the Bermudas, Honduras, Antigua, Barbados, and Nova Scotia. Jamaica was obtained by conquest in 1655, and the Hudson’s Bay Company established itself in what became northwestern Canada from the 1670s on.
The East India Company, chartered in 1600, began establishing trading posts in India early in the 17th century, and the Straits Settlements (Penang, Singapore, Malacca, and Labuan) became British through an extension of that company’s activities.
The first permanent British settlement on the African continent was made at James Island in the Gambia River in 1661. Slave trading had begun earlier in Sierra Leone, but that region did not become a British possession until 1787.
Britain acquired the Cape of Good Hope (now in South Africa) in 1806, and the South African interior was opened up by Boer and British pioneers under British control.
Nearly all these early settlements arose from the enterprise of particular companies and magnates rather than from any effort on the part of the English crown. The crown exercised some rights of appointment and supervision, but the colonies were essentially self-managing enterprises.
The formation of the empire was thus an unorganized process based on piecemeal acquisition, sometimes with the British government being the least willing partner in the enterprise.
In the 17th and 18th centuries, the crown exercised control over its colonies chiefly in the areas of trade and shipping. In accordance with the mercantilist philosophy of the time, the colonies were regarded as a source of necessary raw materials for England and were granted monopolies for their products, such as tobacco and sugar, in the British market.
In return, they were expected to conduct all their trade by means of English ships and to serve as markets for British manufactured goods. The Navigation Act of 1651 and subsequent acts set up a closed economy between Britain and its colonies; all colonial exports had to be shipped on English ships to the British market, and all colonial imports had to come by way of England.
This arrangement lasted until the combined effects of the Scottish economist Adam Smith’s Wealth of Nations (1776), the loss of the American colonies, and the growth of a free-trade movement in Britain slowly brought it to an end in the first half of the 19th century.
The slave trade acquired a peculiar importance to Britain’s colonial economy in the Americas, and it became an economic necessity for the Caribbean colonies and for the southern parts of the future United States.
Movements for the abolition of slavery came to fruition in British colonial possessions long before the similar movement in the United States: the slave trade was abolished in 1807 and slavery itself in Britain’s dominions in 1833.
Competition with France
British military and naval power, under the leadership of such men as Robert Clive, James Wolfe, and Eyre Coote, gained for Britain two of the most important parts of its empire—Canada and India. Fighting between the British and French colonies in North America was endemic in the first half of the 18th century, but the Treaty of Paris of 1763, which ended the Seven Years’ War (known as the French and Indian War in North America), left Britain dominant in Canada. In India, the East India Company was confronted by the French Compagnie des Indes, but Robert Clive’s military victories against the French and the rulers of Bengal in the 1750s provided the British with a massive accession of territory and ensured their future supremacy in India.
The loss of Britain’s 13 American colonies in 1776–83 was compensated for by new settlements in Australia from 1788 and by the spectacular growth of Upper Canada (now Ontario) after the emigration of loyalists from what had become the United States. The Napoleonic
Wars provided further additions to the empire; the Treaty of Amiens (1802) made Trinidad and Ceylon (now Sri Lanka) officially British, and in the Treaty of Paris (1814) France ceded Tobago, Mauritius, Saint Lucia, and Malta.
Malacca joined the empire in 1795, and Sir Stamford Raffles acquired Singapore in 1819. Canadian settlements in Alberta, Manitoba, and British Columbia extended British influence to the Pacific, while further British conquests in India brought in the United Provinces of Agra and Oudh and the Central Provinces, East Bengal, and Assam.
New Zealand became officially British in 1840, after which systematic colonization there followed rapidly. Partly owing to pressure from missionaries, British control was extended to Fiji, Tonga, Papua, and other islands in the Pacific Ocean, and in 1877 the British High Commission for the Western Pacific Islands was created.
In the wake of the Indian Mutiny (1857), the British crown assumed the East India Company’s governmental authority in India. Britain’s acquisition of Burma (Myanmar) was completed in 1886, while its conquest of the Punjab (1849) and of Balochistān (1854–76) provided substantial new territory in the Indian subcontinent itself.
The completion of the Suez Canal by a French company in 1869 provided Britain with a much shorter sea route to India. Britain responded to this opportunity by expanding its port at Aden, establishing a protectorate in Somaliland (now Somalia), and extending its influence in the sheikhdoms of southern Arabia and the Persian Gulf.
Cyprus, which was, like Gibraltar and Malta, a link in the chain of communication with India through the Mediterranean, was occupied in 1878. Elsewhere, British influence in the Far East expanded with the development of the Straits Settlements and the federated Malay states, and in the 1880s protectorates were formed over Brunei and Sarawak.
Hong Kong island became British in 1841, and an “informal empire” operated in China by way of British treaty ports and the great trading city of Shanghai.
The greatest 19th-century extension of British power took place in Africa, however. Britain was the acknowledged ruling force in Egypt from 1882 and in the Sudan from 1899. In the second half of the century, the Royal Niger Company began to extend British influence in Nigeria, and the Gold Coast (now Ghana) and The Gambia also became British possessions.
The Imperial British East Africa Company operated in what are now Kenya and Uganda, and the British South Africa Company operated in what are now Zimbabwe (formerly Southern Rhodesia), Zambia (formerly Northern Rhodesia), and Malawi.
Britain’s victory in the South African War (1899–1902) enabled it to annex the Transvaal and the Orange Free State in 1902 and to create the Union of South Africa in 1910. The resulting chain of British territories stretching from South Africa northward to Egypt realized an enthusiastic British public’s idea of an African empire extending “from the Cape to Cairo.”
By the end of the 19th century, the British Empire comprised nearly one-quarter of the world’s land surface and more than one-quarter of its total population.
The idea of limited self-government for some of Britain’s colonies was first recommended for Canada in Lord Durham’s report of 1839. The report proposed “responsible self-government” for Canada, so that a cabinet of ministers chosen by the Canadians could exercise executive powers instead of officials chosen by the British government.
The cabinet would depend primarily on support by the colonial legislative assembly for its tenure of ministerial office. Decisions on foreign affairs and defense, however, would still be made by a governor-general acting on orders from the British government in London.
The system whereby some colonies were allowed largely to manage their own affairs under governors appointed by the mother country spread rapidly.
In 1847 it was put into effect in the colonies in Canada, and it was later extended to the Australian colonies, New Zealand, and to the Cape Colony and Natal in southern Africa.
These colonies obtained such complete control over their internal affairs that in 1907 they were granted the new status of dominions. In 1910 another dominion, the Union of South Africa, was formed from the Cape Colony, Natal, and the former Boer republics of the Transvaal and the Orange Free State.
This select group of nations within the empire, with substantial European populations and long experience of British forms and practices, was often referred to as the British Commonwealth.
The demands and stresses of World War I and its aftermath led to a more formal recognition of the special status of the dominions. When Britain declared war on Germany in 1914, it did so on behalf of the entire empire, the dominions as well as the colonies. But after World War I ended in 1918, the dominions signed the peace treaties for themselves and joined the newly formed League of Nations as independent states equal to Britain.
In 1931 the Statute of Westminster recognized them as independent countries “within the British Empire, equal in status” to the United Kingdom. The statute referred specifically to the “British Commonwealth of Nations.” When World War II broke out in 1939, the dominions made their own declarations of war.
The rest of the British Empire consisted for the most part of colonies and other dependencies whose predominant indigenous populations had no such experience. For them a variety of administrative techniques was tried, ranging from the sophisticated Indian Civil Service, with its largely effective adoption of native practices in civil law and administration,
to the very loose and indirect supervision exercised in a number of African territories, where settlers and commercial interests were left much to themselves while native Africans were segregated into “reserves.”
Nationalism and the Commonwealth
Nationalist sentiment developed rapidly in many of these areas after World War I and even more so after World War II, with the result that, beginning with India in 1947, independence was granted them, along with the option of retaining an association with Great Britain and other former dependencies in the Commonwealth of Nations (the adjective “British” was not used officially after 1946). Indian and Pakistani independence was followed by that of Ceylon (now Sri Lanka) and Burma (Myanmar) in 1948.
The Gold Coast became the first sub-Saharan African colony to reach independence (as Ghana) in 1957. The movement of Britain’s remaining colonies in Africa, Asia, and the Caribbean toward self-government gained speed in the years after 1960 as international pressure mounted (especially at the United Nations), as the notion of independence spread in the colonies themselves, and as the British public, which was no longer actively imperial in its sentiments, accepted the idea of independence as a foregone conclusion.
The last significant British colony, Hong Kong, was returned to Chinese sovereignty in 1997. By then, virtually nothing remained of the empire. The Commonwealth, however, remained a remarkably flexible and durable institution. See also colonialism.
Withdrawal from the empire
Britain, not entirely by coincidence, was also beginning its withdrawal from the empire. Most insistent in its demand for self-government was India. The Indian independence movement had come of age during World War I and had gained momentum with the Massacre of Amritsar of 1919.
The All-India Congress Party, headed by Mohandas K. Gandhi, evoked sympathy throughout the world with its policy of nonviolent resistance, forcing Baldwin’s government in the late 1920s to seek compromise.
The eventual solution, embodied in the Government of India Act of 1935, provided responsible government for the Indian provinces, the Indianization of the civil service, and an Indian parliament, but it made clear that the Westminster Parliament would continue to legislate for the subcontinent.
The act pleased no one: not the Indians, not the Labour Party, which considered it a weak compromise, and not a substantial section of the Conservative Party, headed by Churchill, which thought it went too far. Agitation in India continued.
Further British compromise became inevitable when the Japanese in the spring of 1942 swept through Burma to the eastern borders of India while also organizing in Singapore a large Indian National Army and issuing appeals to Asian nationalism.
During the war, Churchill reluctantly offered increasing installments of independence amounting to dominion status in return for all-out Indian support for the conflict. These offers were rejected by both the Muslim minority and the Hindu majority.
The election of a Labour government at the end of World War II coincided with the rise of sectarian strife within India. The new administration determined, with what many regarded as undue haste, that Britain would have to leave India. This decision was announced on June 3, 1947, and British administration in India ended 10 weeks later, on August 15.
Burma (now Myanmar) and Ceylon (now Sri Lanka) received independence by early 1948. Britain, in effect, had no choice but to withdraw from colonial territories it no longer had the military and economic power to control.
The same circumstances that dictated the withdrawal from India required, at almost the same time, the termination of the mandate in Trans-Jordan, the evacuation of all of Egypt except the Suez Canal territory, and in 1948 the withdrawal from Palestine, which coincided with the proclamation of the State of Israel.
It has been argued that the orderly and dignified ending of the British Empire, beginning in the 1940s and stretching into the 1960s, was Britain’s greatest international achievement.
However, like the notion of national unity during World War II, this interpretation can also be seen largely as a myth produced by politicians and the press at the time and perpetuated since.
The ending of empire was calculated upon the basis of Britain’s interests rather than those of its colonies. National interest was framed in terms of the postwar situation—that is, of an economically exhausted, dependent Britain, now increasingly caught up in the international politics of the Cold War.
What later became known as “decolonization” was very often shortsighted, self-interested, and not infrequently bloody, as was especially the case in Malaya (where the politics of anticommunism played a central role) and in Kenya.
Conservative government (1951–64)
The last years of Attlee’s administration were troubled by economic stringency and inflation. The pound was sharply devalued in 1949, and a general election on February 23, 1950, reduced Labour’s majority over the Conservative and Liberal parties to only five seats. Attlee himself was in poor health, and Ernest Bevin, formerly the most politically powerful man in the cabinet, had died.
More-radical members of the party, led by Aneurin Bevan, were growing impatient with the increasingly moderate temper of the leadership. On October 25, 1951, a second general election in a House of Commons not yet two years old returned the Conservatives under Churchill to power with a majority of 17 seats.
The Conservatives remained in power for the next 13 years, from October 1951 until October 1964, first under Churchill—who presided over the accession of the new monarch, Queen Elizabeth II, on February 6, 1952, but was forced to resign on account of age and health on April 5, 1955—and then under Churchill’s longtime lieutenant and foreign secretary, Anthony Eden.
Eden resigned in January 1957, partly because of ill health but chiefly because of his failed attempt to roll back the retreat from empire by a reoccupation of the Suez Canal Zone after the nationalization of the canal by the Egyptian president, Gamal Abdel Nasser, in the summer of 1956.
This belated experiment in imperial adventure drew wide criticism from the United States, the British dominions, and indeed within Britain itself. Although it was cut short in December 1956, when UN emergency units supplanted British (and French) troops, the Suez intervention divided British politics as few foreign issues have done since.
Eden was succeeded by his chancellor of the Exchequer, Harold Macmillan. Macmillan remained in office until October 1963, when he too retired because of ill health, to be succeeded by Sir Alec Douglas-Home, then foreign secretary.
In this period of single-party government, the themes were economic change and the continued retreat from colonialism.
Labour interlude (1964–70)
The long Conservative tenure came to an end on October 16, 1964, with the appointment of a Labour administration headed by Harold Wilson, who had been Labour leader only a little more than a year and a half—since the death of the widely admired Hugh Gaitskell. Gaitskell and prominent Conservative R.A.
Butler had been the principal figures in the politics of moderation known as “Butskellism” (derived by combining their last names), a slightly left-of-centre consensus predicated on the recognition of the power of trade unionism, the importance of addressing the needs of the working class, and the necessity of collaboration between social classes.
Although Wilson was thought to be a Labour radical and had attracted a substantial party following on this account, he was in fact a moderate. His government inherited the problems that had accumulated during the long period of Conservative prosperity: poor labour productivity, a shaky pound, and trade union unrest.
His prescription for improvement included not only a widely heralded economic development plan, to be pursued with the introduction of the most modern technology, but also stern and unpopular controls on imports, the devaluation of the pound, wage restraint, and an attempt, in the event these measures proved unsuccessful, to reduce the power of the trade unions.
Eventually the Wilson government became unpopular and was kept in power primarily by weakness and division in the Conservative Party. Finally, in 1968, Wilson was confronted with an outbreak of civil rights agitation in Northern Ireland that quickly degenerated into armed violence.
The return of the Conservatives (1970–74)
The Conservatives returned in a general election on June 18, 1970, with a majority of 30. The new prime minister, Edward Heath, set three goals: to take Britain into the European Economic Community (EEC; ultimately succeeded by the European Union [EU]), to restore economic growth, and to break the power of the trade unions.
In his short term in office he succeeded only in negotiating Britain’s entry into the EEC, in 1973. In fact, Heath was defeated by the trade unions, which simply boycotted his industrial legislation, and by the Arab oil embargo, which began in 1973 and which made a national coal miners’ strike in the winter of 1973–74 particularly effective.
Heath used the strongest weapon available to a prime minister—a general election, on February 28, 1974—to settle the issue of who governed Britain.
The election, held when factories were in operation only three days a week and civilian Britain was periodically reduced to candlelight, was a repudiation of the policy of confrontation with labour.
Labour back in power (1974–79)
Despite losing the popular vote by more than 200,000 votes to the Conservatives, Labour and Wilson returned as a minority government and promptly made peace by granting the miners’ demands. Wilson’s position was confirmed on October 10, 1974, in a second election, when his minority government, which had depended on cooperation from the Scottish National Party and Plaid Cymru (the Welsh nationalist party) as well as the Liberals, was returned with a bare overall majority. The Labour government faced severe economic challenges
—including post-World War II record levels of unemployment and inflation—yet Wilson was able to renegotiate British membership in the EEC, which was confirmed in a referendum in June 1975.
However, neither Wilson nor James Callaghan, who succeeded him on April 5, 1976, was able to come to terms with the labour unions, which were as willing to embarrass a Labour government as a Conservative one. Labour’s parliamentary position was precarious, and the party lost its governing majority through a series of by-election defeats and defections.
Labour survived through what became known as the “Lib-Lab Pact,” an agreement between Callaghan and Liberal Party leader David Steel, which lasted until August 1978. Union unrest,
induced by rapidly increasing prices, made the late 1970s a period of almost endless industrial conflict, culminating at the end of 1978 in the “Winter of Discontent,” a series of bitter disputes, which the government seemed unable to control and which angered the voters.
Meanwhile, Labour’s slender majority in the House of Commons eroded with the defection of the Liberal and nationalist parties following the defeat of referenda in Wales and Scotland that would have created devolved assemblies. On March 28, 1979,
Callaghan was forced from office after losing a vote of confidence in the House of Commons by a single vote (311–310), the first such dismissal of a prime minister since MacDonald in 1924.
The Margaret Thatcher government (1979–90)
The Falkland Islands War, the 1983 election, and privatization
In the subsequent election, in May 1979, the Conservatives under the leadership of Margaret Thatcher were swept into power with the largest electoral swing since 1945, securing a 43-seat majority.
After an extremely shaky start to her administration, Thatcher achieved popularity by sending the armed forces to expel an Argentine force from the Falkland Islands (see Falkland Islands War) in the spring of 1982, on the strength of which she won triumphant reelection in June 1983, her party capturing nearly 400 seats in the House of Commons and a 144-seat majority.
The opposition Labour Party suffered its worst performance since 1918, winning only 27.6 percent of the vote—just 2.2 percent more than the alliance of the Liberals and the Social Democratic Party, a party formed by Labour defectors.
Riding this wave of success, the Thatcher government proceeded with a thoroughgoing privatization of the economy; among its most consequential elements was the privatization of the railway system, though that sale was completed only under her successor, John Major.
Like the accompanying deindustrialization of what had been a manufacturing Britain, this transformation of the transportation infrastructure had immense consequences, resulting in a public transport system that was widely perceived as chaotic and inefficient, as well as in a great increase in private automobile use and in road building.
Thatcher’s advocacy of what eventually became known as neoliberalism was in fact part of a similar international response to changes in the global economy driven by the United States during the presidency of Ronald Reagan (predicated on the free market and supply-side economics), with whom Thatcher formed a strong personal alliance.
Deindustrialization and privatization began to change the face of Britain, one fairly immediate outcome being mass unemployment.
Racial discrimination and the 1981 England riots
Partly in response to this development but also prompted by long-simmering tensions, a series of disturbances broke out in British cities in 1981, particularly in Liverpool and London, when a chronically underprivileged young black urban population turned
its sense of alienation from much of British society against the police. Since the Notting Hill race riots of 1958 in London, the integration of the immigrant West Indian community into British society had been a major problem.
This problem worsened with the arrival, beginning in the 1960s, of South Asian immigrants from East Africa and the Indian subcontinent, who, like the Caribbean population, were highly concentrated in particular areas of the country and of cities.
Elements in the Conservative Party, led by Enoch Powell, were not averse to creating political capital out of this situation, though Powell’s English patriotism was more complex than most Conservative gut reactions.
His liberal economics, along with the advocacy of the free market by Keith Joseph, was very influential on the party, especially on Thatcher.
Despite promises to alleviate the urban poverty of immigrant communities, little was done in the 1980s, and in the 1990s the exclusion of blacks and to a lesser extent South Asians from an equal share in the benefits of British society continued to be a critical problem, one which politicians confronted reluctantly and to limited effect.
The 2001 England riots
Politicians’ reluctance to act had been evident earlier in the very limited nature of the Race Relations Act of 1965, itself fiercely opposed by the Conservatives. A subsequent amendment, in 1968, outlawed discrimination in areas such as employment and the provision of goods and services. However, it was not until the Race Relations Act of 1976 that any real change was evident.
This act made both direct and indirect discrimination an offense and provided legal redress for those discriminated against through employment tribunals and the courts. Yet another amendment to the act, in 2001, included public bodies, particularly local authorities and the police, whose role in black communities continued to be a considerable source of tension.
This unease was compounded by endemic inequality and deprivation in ethnic (especially Asian) communities. In 2001 the result was a wave of public disturbances across the north of England, in which disaffected youth once again played a leading role. In Britain,
in the aftermath of the September 11 attacks on the United States, the advent of the so-called “war on terror” served to deepen existing divisions by giving “racial” tensions a new form, that of “Islamophobia.”
The “Troubles” in Northern Ireland
A considerable degree of reluctance also characterized the other great problem of the Thatcher administrations, namely the conflict in Northern Ireland. Since 1945 successive British governments had failed to address discrimination against Catholics in Northern Ireland.
The international civil rights current of the late 1960s triggered a new and intensive wave of protest in Northern Ireland, which was met by a continuing reluctance to reform and by police overreaction.
Into this increasingly explosive situation stepped the Provisional Irish Republican Army (IRA), which had separated from the long-established “Official” IRA in 1969 and which gained support after 13 Roman Catholic civil rights demonstrators were killed by British troops in Londonderry on January 30, 1972, an event that became known as Bloody Sunday.
The IRA mounted an increasingly violent campaign against the British Army in Ulster, extending its activity to the British mainland with increasing effect in the 1970s. The so-called “Troubles” continued for the better part of three decades, with the British Army and the IRA ultimately fighting to a vicious draw.
The Troubles also took the form of sectarian strife in Northern Ireland, polarizing the Protestant and Catholic communities, each of which had its own paramilitary organizations.
The IRA “hunger strikers” of the early 1980s failed to move Thatcher, a resistance that probably ultimately harmed her cause by producing great sympathy for the republican movement in Northern Ireland.
Nor did she appear to be moved by the bombing at the Conservative Party conference in Brighton in 1984, an attempt on her own life that resulted in the deaths of several of her friends and colleagues within the party.
Nonetheless, even at this parlous time, unofficial and secret contacts were being established with the IRA. These led to the very long and tortuous process of negotiation that eventually became known as the “peace process.”
Despite being unable to resolve the Irish problem, Thatcher succeeded in 1987 in winning an unprecedented third general election, and in January 1988 she surpassed Asquith as the longest continually serving prime minister since Lord Liverpool (1812–27).
Thatcher’s electoral success came from her extraordinary capacity for leadership and the development of “Thatcherism.” Responding to widespread disillusionment with Labour government and the state, Thatcher was able to tap into, and give leadership to, a politics of freedom and choice that expressed the desires of many people in the 1980s.
In the wake of the debacle that the 1970s had been for the political left and trade union movement, Thatcherism’s variant of contemporary free-market neoliberalism gained increasing momentum.
It effectively ended the postwar accommodation sometimes referred to as the corporate state, through which government, the unions, and business enabled a form of state-managed capitalism to develop. In its movement away from that accord, Britain foreshadowed developments in central and eastern Europe after the demise of communism there in 1989.
Thatcher’s premiership, however, did not survive her third term. She alienated even fellow Conservatives with her insistence on replacing local property taxes with a uniform poll tax and with her unwillingness to fully integrate the pound into a common European currency.
By the end of 1989, voter discontent was manifest in by-elections, and in November 1990 Thatcher faced serious opposition for the first time in the Conservative Party’s annual vote for selection of a leader.
When she did not receive the required majority, she withdrew, and John Major, the chancellor of the Exchequer since October 1989, was chosen on November 27. Thatcher resigned as prime minister the following day.
The government of John Major (1990–97)
“Black Wednesday,” epidemic scandals, and Major’s “Citizen’s Charter”
Despite having presided over the country’s longest recession since the 1930s, the Conservatives—aided in part by the Labour Party’s overconfidence—won their fourth consecutive election in April 1992, albeit with a diminished majority of 21 in Parliament.
That they did so was largely a result of the ongoing conflict within Labour as it continued to undergo “modernization.” As the recession lingered, the popularity of Major—and of the Conservatives—plummeted, and the party fared poorly in by-elections and in local elections.
Major’s economic policies were questioned after the “Black Wednesday” fiasco of September 16, 1992, when he was forced to withdraw Britain from the European exchange-rate mechanism and devalue the pound.
Despite having pledged not to increase taxes during the 1992 campaign, Major supported a series of increases to restore Britain’s financial equilibrium.
When he sought to secure passage of the Treaty on European Union in 1993, his grip on power was challenged. Twenty-three Conservatives voted against a government resolution on the treaty, defeating the government and compelling Major to call a vote of confidence in order to secure ratification.
Tory troubles mounted with scandals in local governments, particularly in Westminster in 1994, and thereafter Major was seemingly unable to shake off the growing reputation of his government not only for economic mismanagement but also for corruption and moral hypocrisy.
A seemingly unending series of financial and sexual scandals took their toll, and paper offensives such as Major’s “Citizen’s Charter,” which attempted to stem growing concern about the efficiency and responsibility of privatized industry by laying down citizens’ rights, made little impact.
“Mad cow disease”
As criticism of his leadership mounted within the Conservative Party, Major resigned as party leader in June 1995. In the ensuing leadership election, he solidified his position—though 89 Conservative members of Parliament voted for his opponent and 22 others abstained or spoiled their ballots.
Major’s government was also severely criticized for its handling of the crisis involving “mad cow disease,” in which it was discovered that large numbers of cattle in the human food supply in Britain were infected with bovine spongiform encephalopathy.
Facing a rejuvenated Labour Party under the leadership of Tony Blair, the Conservatives suffered a crushing defeat in the general election of 1997, winning only 165 seats, their fewest since 1906. Labour’s 419 seats and its 179-seat majority were its largest in British history.
The Tony Blair government (1997–2007)
The struggle for control of Labour
During its years out of power, the Labour Party had undergone a gradual transformation as it attempted to distance itself from the power of the unions on the one hand and, on the other, from the power of the membership as exercised through the traditional role of the Labour Party Conference.
This process had been started before 1992 by Neil Kinnock, who led the party from 1983 to 1992, and it was continued by his successors, first John Smith and then Blair.
The need for fundamental reappraisal had been urged as early as 1981, with the founding of the Social Democratic Party, when prominent Labour Party politicians, led by Roy Jenkins, seceded from the party in an attempt to “break the mould” of British politics.
Divisions not only between the right and left in the party but also within the left of the party itself added to the chaos that was the British left in the 1980s; the insistence of the radical leftist and former Labour minister Tony Benn on running against the former Labour chancellor Denis Healey in the party election for deputy leadership in 1981
effectively split the radical democratic left and disabled the possibility of an early riposte to Thatcher. It also, ironically enough, contributed to what became known as “New Labour,” rather than a more left-wing variant of labourism, eventually replacing the Conservatives.
New Labour, the repeal of Clause IV, and the “third way”
The understanding that the party would have to rethink the market (not only in economic but in social terms), embracing it in a way foreign to many of the unions and the traditional Labour left, grew increasingly after 1992, until, after the Labour victory of 1997, there was a clearly marked path for New Labour.
The most symbolically important marker of the change from old to New Labour was the repeal of the party’s Clause IV, engineered by Blair in 1995. The replacement of old Clause IV, which had committed the party to the “common ownership of the means of production,” ended almost 80 years of dedication to that goal.
The new path of the party was to be a middle one, in the phraseology of New Labour, a “third way,” supposedly embracing both social justice and the market.
Not only in rhetoric but in reality, New Labour was to be different from the old. There was also to be increasing attention to the importance of the media, an attention that the Tories had developed into something of a fine art under Thatcher, with her press secretary Bernard Ingham.
Given the increasing role of the media in the presentation of politics and indeed the almost wholesale integration of political substance and political style through the media, this mastery of the art of “spin” was to become a political necessity.
Therefore, art for art’s sake (spin for spin’s sake) was to become a feature of Labour government after 1997. This approach was ultimately to rebound upon the party and, indeed, upon the political process in general during the next decade with the emergence of widespread disillusionment with politics in British society, especially among young people.
Navigating the European monetary system and the EU Social Chapter
Labour’s landslide victory in 1997, which undoubtedly benefited from the inspirational leadership Blair seemed to offer, nevertheless may have been less the result of an unbounded belief in New Labour than of the discrediting of the Conservative Party.
It is certain that Blair was helped into power by the parlous state into which the Conservative Party had fallen under Major after 1992. Promising that “we ran for office as New Labour, and we shall govern as New Labour,” the Blair government in fact began in a rather conservative fashion, by accepting existing government spending limitations.
Nonetheless, the difficult and what came to be the increasingly troubled task of combining aspects of Thatcherism with the idea of a “social market” gathered momentum.
Certainly, through much of Blair’s tenure a buoyant economy, well managed by Chancellor of the Exchequer Gordon Brown, did a great deal to ease the passage of New Labour and the third way.
In his first major initiative and one of his boldest moves, Blair, abetted by Brown, granted the Bank of England the power to determine interest rate policy without government consultation. This was a major move in the disengagement of financial markets from the state.
Blair’s government was also more and more taken up with the question of whether Britain should join or remain outside the European monetary union.
At stake were fundamental ideas about British sovereignty and whether, in a progressively globalized world in which some claimed that the individual nation-state was becoming unviable, sovereignty in its existing forms could remain intact.
For the Conservative Party, ever more hostile to the EU, this question was central to its attempts to fight back against the Labour Party.
Blair’s government did sign the Treaty on European Union’s Social Chapter—which sought to harmonize European social policies on issues such as working conditions, equality in the workplace, and worker health and safety—despite Major’s earlier negotiation of an “opt out” mechanism to placate the treaty’s Conservative opponents.
However, the Labour Party’s implementation of the Social Chapter was at best halfhearted, and its goal became to influence as much as possible the EU itself to moderate the operations of the chapter. As with financial deregulation, the emphasis in labour affairs was on the market.
The Good Friday Agreement
Conspicuous progress was also made in solving the problem of Northern Ireland. Under Major, in 1994, the IRA declared a cease-fire, the Protestant paramilitaries followed suit soon after, and talks between the British government, the Irish government, and Sinn Féin began.
The IRA cease-fire opened the way for a long and involved series of negotiations, culminating in the Good Friday Agreement (Belfast Agreement) of 1998, which seemed at last to have brought peace to Northern Ireland. Unionist suspicion and concern about fundamental reforms to the traditional power structure of the province meant,
however, that the implementation of the agreement became a tortuous business. Indeed, it took almost another decade to arrive at what looked like a final resolution, when in 2007 the Northern Ireland Assembly was restored on the basis of power sharing between what had erstwhile been bitter enemies, Sinn Féin and the Ian Paisley-led Democratic Unionist Party.
London’s local government, House of Lords reform, and devolution for Scotland and Wales
In May 1998 voters in London overwhelmingly approved the government’s plan for a new assembly for the city and for its first directly elected mayor, resulting in the capital’s first citywide government since the abolition of the Greater London Council by Thatcher in 1986.
However, the precedent of an elected mayor in London was not subsequently followed by similar action in other major British cities.
In the late 1990s the Labour government also carried out several other constitutional reforms. The House of Lords, previously dominated by hereditary peers (nobles), was reconstituted as an assembly composed primarily of appointive life peers, with only limited representation of hereditary peers.
Nonetheless, the striking contradiction of an unelected legislative assembly in a country that prided itself on its traditions of liberal democracy was apparent.
Following referenda in Wales and Scotland, the National Assembly for Wales and the Scottish Parliament were established in 1999 and granted powers previously reserved for the central government.
Yet, with the exception of political devolution to the component states of the United Kingdom, the Labour Party remained reluctant to reform the constitution, so that at the beginning of the 21st century it was still the revered mysteries of the uncodified British constitution by which the British were governed.
The royal family’s “annus horribilis,” the death of Princess Diana, and the Millennium Dome
The 1990s were a period of transition and controversy for the monarchy. In 1992, during what Queen Elizabeth II referred to as the royal family’s “annus horribilis,” Charles, prince of Wales, heir to the British throne, and his wife, Diana, princess of Wales, separated, as did Elizabeth’s son Andrew, duke of York, and his wife, Sarah, duchess of York.
Moreover, Elizabeth’s daughter, Anne, divorced, and a fire gutted the royal residence of Windsor Castle. After details of extramarital affairs by Charles and Diana surfaced and the couple divorced, observers openly questioned Charles’s fitness to succeed his mother as sovereign, and public support for the monarchy ebbed.
The immensely popular Diana (dubbed the “People’s Princess”) died in an automobile accident in Paris in 1997, prompting an outpouring of grief, or at least hysteria, throughout the world.
The British royal family came under scrutiny for its handling of the matter—especially the queen’s reluctance, because of tradition, to allow the national flag to fly at half-staff over Buckingham Palace.
With the queen celebrating her 50th wedding anniversary, the queen mother, Elizabeth, celebrating her 100th birthday, and Charles working hard to improve his public image, the fortunes of the monarchy improved by the end of the 1990s.
Nevertheless, the established institutions of the British state had been called into question in an unprecedented way. If the popularity of the monarchy survived, it was largely the result of the queen’s persona; the royal family as a whole—itself the idealized media creation of late Victorian times—frequently had become the object of ridicule.
The transformation of the monarchy was indeed emblematic of the very unevenly progressing severance of the British from the long-lived institutions and culture of the 19th century.
To celebrate the new millennium, the monumental Millennium Dome, the largest structure of its kind in the world, and the Millennium Bridge were opened in London.
It was perhaps symbolic of the contradictions of this modernity that the dome was dogged by controversy regarding its cost and design and the bridge by the fiasco of its opening, when it was found to move alarmingly above the waters of the Thames when in public use.
The battle for the soul of the Conservative Party
In June 2001 Blair’s government was reelected with a 167-seat majority in the House of Commons—the largest majority ever won by a second-term British government.
With the question of European integration continuing to be of great significance in British politics, the new Labour administration chose not to adopt the common European currency, the euro, partly because of a fear of popular response.
However, it was on the Conservative side that Britain’s relationship with Europe was most urgently a party issue. It continued to divide a party riven by differences, a party that looked more and more like the Labour Party of the 1980s and early ’90s.
Indeed, there was a direct parallel between the recent histories of the two parties: the traditional left of the Labour Party corresponded to the traditional right of the Conservative Party, as both fought hard to stem the tide of party modernization.
The battle for the soul of the Conservative Party was joined with growing fervour with the election of David Cameron in December 2005 as its modernizing leader. His subsequent attempt to steer the party back to the political centre,
and away from the old order of the Thatcherite legacy, was every bit as difficult as the redirection undertaken by Labour modernizers. In addition to Europe and economic policy, the issue of increased levels of immigration into Britain after 2000 further divided the Conservatives.
Indeed, Britain as a whole became divided on this issue. Large bodies of opinion, stirred up by xenophobia in the popular press, responded with fear and anxiety to increased levels of immigration from central and eastern Europe that were a consequence of European integration.
In a more globalized and war-ridden world, the burgeoning flow of asylum seekers into Britain added to this climate, as did the “war on terror.”
Asian Muslims, many of them long-standing British citizens and British-born, were nonetheless frequently lumped with immigrants and asylum seekers as part of an undifferentiated external threat to Britishness.
Response to the September 11 attacks
Following the September 11 attacks on the United States in 2001, global terrorism dominated the political agenda in Britain, and Blair closely allied himself with the administration of U.S. Pres. George W. Bush.
Britain contributed troops to the military effort to oust Afghanistan’s Taliban regime, which was charged with harbouring Osama bin Laden, who had founded al-Qaeda, the terrorist organization linked to the September 11 attacks.
Although Blair received strong support for his antiterrorist strategy from the Conservatives and Liberal Democrats in the House of Commons, a small minority of Labour members of Parliament opposed military action.
The Blair government also faced a slowing economy and a widespread perception that public services such as health, education, and transportation had not improved.
Although large amounts of public money had been spent, particularly on the health service, much of this went into elaborating the new and highly evolved structures of management that came to characterize Labour administration of the state.
However, it was the subject of the Iraq War, and Britain’s support for the U.S. position on it, that did most to undermine the standing of Blair.
Weapons of mass destruction and the Iraq War
From late 2002, politics in Britain was dominated by Blair’s decision to support military action to oust from power the Iraqi government of Saddam Hussein, which was alleged to either possess or be developing weapons of mass destruction (WMD) that might either be used against Iraq’s neighbours or find their way into the hands of international terrorists.
Notwithstanding widespread and enormous public protests against war, the resignation of several government ministers, and the support of some one-third of the parliamentary Labour Party for a motion opposing the government’s policy, Blair remained steadfast in his conviction that Saddam was an imminent threat that had to be removed.
Following Saddam’s ouster, however, British and American intelligence was found to have been faulty. When no WMD were found, critics of the government charged that it had distorted (“sexed up”) intelligence to solidify its claims against the Iraqis. Nevertheless,
in May 2005 Blair won another term as prime minister—albeit with a significantly reduced parliamentary majority—as Labour won its third consecutive general election for the first time in the party’s history.
The fallout from the Iraq War—initially the controversy over the decision to go to war in the first place and then the protracted involvement in a conflict that began to look more and more like a civil war—sapped public and political support for Blair.
But, ever the consummate politician, he held on for two years after his reelection despite the friction between himself and his appointed successor, Gordon Brown, who became the new prime minister in June 2007.
The Gordon Brown government (2007–10)
Brown’s hold on power was threatened in spring 2009. With the British economy already shaken by the spreading worldwide recession engendered by the financial crisis of late 2008, a scandal broke involving many dozens of members of Parliament who had extravagantly abused their government expense accounts, including members of Brown’s cabinet.
The scandal and the troubled economy contributed to anemic performances by the Labour Party in local elections in Britain and in those for the European Parliament. Brown responded with a thorough reshuffle of his cabinet and withstood a challenge to his leadership from within the party in early June by promising to change his leadership style.
Conservative-Liberal Democrat coalition rule (2010–15)
The U.K. general election of 2010
Brown’s popularity and that of his party nevertheless continued to wane as a general election, called for May 6, 2010, approached. The campaign brought a novelty to British general elections—televised debates between the leaders of the three main parties:
Brown of the Labour Party, David Cameron of the Conservative Party, and Nick Clegg of the Liberal Democrats. Clegg’s outstanding performance in the first debate resulted in a surge in the preelection polls for the Liberal Democrats—who passed Labour to challenge the Conservative lead and to create both unprecedentedly high expectations for the Liberal Democrats and doubt as to whether any party would be able to secure enough seats to form a majority government.
In the event, the Liberal Democrats actually obtained fewer seats in 2010 than in the 2005 election. The Conservatives finished as the largest party, winning 306 seats, but they finished 20 seats shy of a majority.
The resulting “hung parliament” ironically left the Liberal Democrats potentially holding the balance of power. Labour finished with 258 seats, a fall of 91 from the 2005 election.
When negotiations to form a Liberal Democrat–Labour coalition failed, the Liberal Democrats joined the Conservatives in a coalition government led by Cameron, who became prime minister on May 11, and Clegg, who became deputy prime minister.
In October the government announced a five-year austerity plan aimed at reducing the country’s massive deficit, which had been fueled by bank bailouts and stimulus spending in the wake of the 2008 financial crisis and resultant recession.
The plan incorporated some of the British government’s deepest spending cuts since World War II, including reductions to welfare entitlements and the dismissal of up to 500,000 public-sector employees, as well as phasing in a pension eligibility age increase from 65 to 66 four years earlier than had been planned.
In December Parliament voted to raise the ceiling on university tuition from the existing cap of £3,290 (about $5,200) to £9,000 (about $14,000), prompting a series of demonstrations and causing dissension in the coalition government.
The Liberal Democrats had campaigned against the tuition hike during the general election, and some Liberal Democrat MPs continued to oppose it when it came to a vote. The rift in the coalition widened following Conservative opposition to the Liberal Democrat–supported referendum on a proposal to replace the country’s first-past-the-post election method with the alternative vote.
The referendum, which the British public soundly rejected, was held as part of local elections in May 2011, in which the Conservatives’ share of English council constituencies increased moderately but that of the Liberal Democrats plummeted, to the benefit of Labour.
Although there were some calls for Clegg to step down, support for him among Liberal Democrats generally remained strong. The election also resulted in a sweeping victory for the Scottish National Party, which secured the first majority government in the history of the Scottish Parliament, emboldening First Minister Alex Salmond to announce that he would seek to hold a referendum on independence.
Intervention in Libya
As the outbreak of popular uprisings in the Middle East and North Africa known as the Arab Spring unfolded in early 2011, the revolt in Libya and Libyan ruler Muammar al-Qaddafi’s brutal repression of it became a particular focus of British attention.
Although Cameron was criticized for the less-than-efficient removal of British nationals from Libya and for a botched effort by British special forces to contact the anti-Qaddafi rebels, he remained adamant in his criticism of Qaddafi and in his call for foreign intervention to protect the rebels from the Qaddafi regime’s superior forces, most notably with the enforcement of a no-fly zone. Cameron and French Pres.
Nicolas Sarkozy were instrumental in steering the UN Security Council to authorize military action on March 17. Beginning March 19, a coalition of U.S. and European forces with warplanes and cruise missiles attacked targets in Libya in an effort to disable Libya’s air force and air defense systems so that the UN-authorized no-fly zone could be imposed.
On March 27 NATO officially took command of military operations in Libya previously directed by the United States, France, and the United Kingdom.
News of the World hacking scandal
In April 2011 much of the world’s attention was directed at Britain for the wedding in London of Prince William (the grandson of Queen Elizabeth and second in line to the throne after his father, Prince Charles) to Catherine Middleton.
In July a scandal that had been smoldering since 2005 broke out in full flame when the News of the World, one of the flagship newspapers of Rupert Murdoch’s News International media empire, ceased publication after it became clear that a number of the paper’s reporters and editors had engaged in or condoned the illegal hacking of telephone voice mails of some 4,000 Britons,
including a child murder victim, the families of soldiers killed while serving in Afghanistan and Iraq, celebrities, politicians, and the British royal family. An earlier investigation had failed to reveal the extent of these violations of privacy (prompting later charges of law enforcement ineptitude and corruption) but led to the resignation of the editor of News of the World, Andy Coulson, in 2007. That did not prevent Coulson from becoming communications chief for Cameron when Cameron took office in 2010, however.
As the scandal grew, Coulson stepped down in January 2011. By the middle of July, in addition to the shuttering of News of the World, the scandal had resulted in the resignation of Rebekah Brooks, the politically powerful chief executive officer of News International, and in the withdrawal of Murdoch’s bid to buy a controlling share of the BSkyB satellite television channel. It also brought about the convening of a number of special parliamentary hearings and commissions.
The 2011 riots, the European sovereign debt crisis, and Cameron’s veto of changes to the Lisbon Treaty
On the night of August 6 a different sort of firestorm broke out when a protest against the killing of a young man by police earlier in the week erupted in widespread rioting in the North London area of Tottenham. In the succeeding days, riots, looting, and arson, mostly by young people, escalated wildly and became the worst rioting that the capital had seen in decades. The riots spread not only to other areas of Greater London but also to other British cities including Liverpool, Birmingham, and Bristol. Largely as a result of the increased deployment of police, however, the riots abated quickly. In the ensuing months, legal authorities used video footage of the events to arrest looters.
Although the United Kingdom remained outside of the euro zone, it was anything but unaffected by the events of the European sovereign debt crisis triggered by Greece’s financial collapse in 2009. Because many of Britain’s principal trading partners were euro-zone members, their economic woes impacted directly on the already sluggish economy of a Britain struggling mightily to reduce its deficit and combat unemployment.
Cameron created controversy in December 2011 when he effectively vetoed changes to the Lisbon Treaty (negotiated at an EU summit) that would have increased economic integration among the EU countries and imposed sanctions on members that surpassed an agreed-upon deficit limit.
His actions strained the Conservatives’ coalition partnership with the Liberal Democrats and were criticized by Deputy Prime Minister Clegg, who called them “bad for Britain,” as well as by French President Sarkozy, who said there were now two Europes—one that wanted “more solidarity between its members and more regulation” and another that was “attached only to the logic of the single market.”
The 2012 London Olympics, Julian Assange’s embassy refuge, and the emergence of UKIP
Britain was the centre of world pomp and pageantry in 2012 not just because of the festive celebration of the 60th anniversary of Elizabeth II’s accession to the throne but because of London’s hosting of the Summer Olympic Games, which, despite initial concerns about inadequate security, were widely regarded as a smashing success.
Then in mid-August 2012 the British government took issue with Ecuador’s granting of political asylum to WikiLeaks founder Julian Assange, who had taken refuge in that country’s embassy in London after exhausting appeals under the British legal system to avoid extradition to Sweden on sexual assault charges.
In spring 2013 the government announced that GDP had grown slightly in the first quarter of the year, thus preventing a slide back into a recession that would have been the third to afflict Britain in five years.
Nevertheless, in local elections in parts of England and Wales in May, voters showed their dissatisfaction with the nature of the economic recovery by turning away from Conservatives and Liberal Democrats, who together lost 459 seats on English local councils.
Labour made significant gains (picking up some 291 seats), but the big story of the election was the tremendous jump in representation by the United Kingdom Independence Party (UKIP), which advocated withdrawal from the EU. By gaining 139 seats, UKIP made one of the most impressive showings by a fourth party in recent British electoral history.
The birth of George, rejection of intervention in Syria, and regulation of GCHQ
On July 22, 2013, with much of the world’s media on “royal baby watch,” Catherine, duchess of Cambridge, gave birth to a boy, George, who became third in the line of succession to the British throne.
In late August, Cameron was dealt a stunning blow by Parliament’s rejection of his proposed response to events in the Syrian Civil War. Amid reports that hundreds of Syrians had been killed earlier in the month in an alleged chemical attack by the regime of Syrian Pres. Bashar al-Assad,
Cameron asked the House of Commons to endorse in principle British military intervention in Syria. Although approval had been widely expected, MPs rejected the proposed involvement by a vote of 285 to 272—largely,
it seemed, as a result of the fear of a rush to judgment regarding the cause of and responsibility for the Syrian deaths and a lack of faith in the certainty of intelligence reports in light of British involvement in the Iraq War.
In October 2013 an impassioned debate arose in the press and in Parliament over whether there was a need for greater regulation of surveillance practices of the Government Communications Headquarters (GCHQ) in the wake of revelations that the agency had far greater eavesdropping capabilities than had previously been acknowledged publicly.
Cameron and others condemned the news leaks (originating with former U.S. CIA and National Security Agency employee Edward Snowden) that had led to the revelations and claimed that the activities of the GCHQ were lawful and necessary for national security.
The relationship of the government to the press had also been strained by the repercussions of a scandal in 2011 that involved press hacking of private phone calls, led to a public inquiry guided by Lord Justice Sir Brian Leveson, and resulted in the establishment of a new press watchdog system (also in October 2013).
The British political landscape was shaken again in May 2014, when UKIP not only made more dramatic gains in local council elections (increasing its representation by more than 150 seats) but also finished first in elections for the European Parliament, becoming the first party other than Labour or the Conservatives in more than 100 years to triumph in a British national election.
Capitalizing upon a growing wave of Euroskepticism that had a similar impact on elections to the European Parliament elsewhere (most notably in France, Denmark, and Hungary), the virulently anti-EU UKIP won 27 percent of the vote and 24 seats (a gain of 11 seats) to 20 seats for Labour (a gain of 7) and 19 seats for the Conservatives (a loss of 7).
The biggest loser in the election was the generally pro-EU Liberal Democratic Party, which fell from 11 seats to a single representative, mirroring the tumble it took in the local elections and prompting some in the party to call for Clegg’s replacement as leader.
Scottish independence referendum
On September 18, 2014, a referendum was held in Scotland on independence from the United Kingdom. The referendum, which had been agreed to by Cameron in 2012, asked a single simple question:
“Should Scotland be an independent country?” Vigorous campaigns had been conducted on both sides of the question, with former prime minister Brown playing a prominent role in opposition to the referendum and proposing a plan that called for codification of the purpose of the United Kingdom, for recognition of the Scottish Parliament as permanent and indissoluble, and for increased income taxing powers for the Scottish government.
Although opinion polls had long indicated that a solid majority of Scots opposed independence, as the day of voting approached, the “yes” side had gained tremendous momentum, and polling indicated that the outcome was very much in question, with the “no” side holding a slight edge.
With the vote just days off, Cameron, Clegg, and Labour Party leader Ed Miliband had jointly published in the newspaper Daily Record a “vow” to increase powers for Scotland’s government if the referendum was rejected. On the day of the vote, some 85 percent of registered voters went to the polls and convincingly defeated the referendum, with about 55 percent voting “no” and about 45 percent voting “yes.” Following the result,
Cameron promised to move swiftly to redeem his promise to devolve more powers to Scotland. He appointed an all-party commission, led by Lord Smith of Kelvin, to consider the details.
The U.K.’s economy grew by about 3 percent in 2014. By the end of the year, it had reversed the decline that it suffered during the recession that started in 2008.
Unemployment, which had peaked at 8.5 percent in 2011, fell to 6 percent in the second half of 2014. However, wages continued to rise more slowly than inflation. The combination of low pay raises and the expansion of low-wage jobs meant that tax revenues during the year were lower than expected.
That shortfall contributed to a rise in the government’s net deficit, which toward the end of 2014 was running about 10 percent higher than during the same period a year earlier.
On September 26, 2014, MPs voted 524–43 to approve British participation in the U.S.-led air strikes against the Islamic State in Iraq and the Levant (ISIL, also called ISIS) insurgents in Iraq.
Cameron made clear that the action would be limited to Iraq and that Britain would not attack ISIL in Syria. Further, he emphasized that Britain would not send troops to take part in a ground war.
David Cameron on his own (2015–16)
The U.K. general election of 2015
Opinion polling right up to the day before voting indicated that the May 2015 U.K. general election might be the closest in recent memory, as a single percentage point separated the Conservative and Labour parties in most polls. Immigration, the government’s austerity policies, the future of the National Health Service, and Britain’s continued membership in the EU were among the key issues in the campaign.
Attempting to address Euroskeptics in his own party and the challenge of UKIP, Cameron promised to renegotiate the terms of British participation in the EU and to put continued EU membership to a national referendum by the end of 2017 if he were reelected.
The Conservatives also intimated that if Labour were to win with less than a majority, it would likely form a coalition with the Scottish National Party (SNP), whose desire for independence would drive the government’s agenda.
When the votes were counted, Cameron and the Conservatives defied the pollsters by capturing 331 seats (a gain of 24 over their showing in 2010), enough to form a majority government without the participation of the Liberal Democrats, whose fortunes plummeted as their party’s representation fell from 57 seats to 8, prompting Clegg’s resignation.
Labour leader Miliband stepped down too, after his party won only 232 seats (down 26 from 2010), and watched the SNP blow away Labour’s traditional dominance of elections in Scotland for the U.K. Parliament by increasing its representation in Westminster from 6 seats to 56.
Although it captured some 13 percent of the total vote, UKIP won only one seat—a consequence of Britain’s winner-take-all election rules—and Farage, who failed to be elected in his constituency, joined the list of resigning party leaders.
The “Brexit” referendum
On December 2, 2015, in the wake of the attacks by Islamist terrorists in Paris on November 13, the House of Commons authorized air strikes by the British military on ISIL targets in Syria. The vote on the measure came after some 10 hours of debate.
Labour leader Jeremy Corbyn freed members of his party to vote their conscience, and dozens of them broke ranks to join the Conservatives and others in voting for authorization, which passed 397–223.
At a summit meeting of the leaders of the member countries of the EU in Brussels in February 2016, the European Council announced agreement on reforms to British membership that had been requested by Cameron in an attempt to forestall British withdrawal (“Brexit”) from the EU.
Although Cameron did not get everything that he had asked for in the proposal that he submitted to Donald Tusk, the president of the European Council, in November 2015, he won enough concessions to move forward on his promise of a referendum on continued British membership.
In the face of considerable support within his own party for Brexit, Cameron nevertheless announced that he would campaign for remaining in the EU and scheduled the referendum for June 23, 2016. Cameron was joined in the “Remain” effort by Corbyn. The “Leave” campaign was headed by former London mayor Boris Johnson, whom many saw as a rival for Cameron’s leadership of the Conservative Party, and Michael Gove, lord chancellor and secretary of state for justice in Cameron’s cabinet.
Opinion polling indicated that the two sides were fairly evenly divided as the referendum approached, but in the event 52 percent of voters opted to leave the EU, making the United Kingdom the first member country to vote to leave the EU.
Cameron announced his intention to resign as prime minister by the time of the Conservative Party conference in October 2016 to allow his successor to negotiate the U.K. withdrawal under the terms of Article 50 of the Lisbon Treaty, which, when triggered, would open a two-year window for the exit process.
The premiership of Theresa May (2016–19)
The resignation of Cameron, the rise of May, and a challenge to Corbyn’s leadership of Labour
Only days after the Brexit vote, the political drama surrounding Johnson’s pursuit of the Conservative leadership assumed what many observers identified as Shakespearean proportions as Gove withdrew his prominent support for Johnson’s candidacy, saying that Johnson was “not capable of…leading the party and the country in the way that I would have hoped.”
In rapid fashion, a wounded Johnson removed himself from consideration. Gove then threw his hat into the small ring of leadership candidates that was then winnowed by successive votes by parliamentary Conservatives in early July to Home Secretary Theresa May and Energy Minister Andrea Leadsom, whose names were put to a vote by all party members with results due in September.
Almost before that process started, Leadsom unexpectedly withdrew her name from consideration, and on July 11 the Conservative Party’s 1922 Committee, which had been steering the leadership contest, declared May the new party leader “with immediate effect.” On July 13 Cameron formally resigned, and May became the second woman in British history to serve as prime minister.
Meanwhile, Labour underwent its own leadership controversy as prominent party members, including Blair, took Corbyn to task for not mounting a more vigorous effort on behalf of the “Remain” campaign.
No sooner had Blair made his criticism than he found himself in the crosshairs, with the release on July 5 of the so-called Chilcot Report, the findings of a seven-year inquiry into Britain’s involvement in the Iraq War, which was scathing in its condemnation of Blair’s handling of the war from the initial decision to join the United States in invading Iraq to the Blair government’s failure to plan and prepare for the postwar aftermath in Iraq. Nonetheless,
a challenge was mounted to Corbyn’s leadership of the party that eventually resulted in a head-to-head contest between Corbyn and Owen Smith, the former shadow secretary of work and pensions.
In an online vote of party faithful in September, Corbyn held on to the leadership by capturing some 62 percent of the vote against about 38 percent for Smith.
Triggering Article 50
In the meantime, May, who had opposed Brexit but came into office promising to see it to completion, led her government in cautious movement toward triggering Article 50.
Her efforts experienced a setback in January 2017, however, when the Supreme Court upheld a November 2016 High Court ruling that prevented the prime minister from triggering Article 50 without first having gained approval from Parliament to do so.
In February 2017 the House of Commons granted May that approval by a 498–114 vote, but the House of Lords created another roadblock in early March by adding a pair of amendments to the bill authorizing May to invoke Article 50.
One guaranteed that EU passport holders residing in Britain would be permitted to remain, and the other sought a greater role for Parliament in the negotiations. Both amendments were overturned by the House of Commons later in March, and, before the end of the month, May formally submitted a letter to European Council Pres. Donald Tusk requesting the opening of the two-year window for talks on British separation from the EU.
Against this backdrop, the Scottish Parliament backed First Minister Nicola Sturgeon’s call for a new referendum on independence for Scotland to be held before spring 2019 (the majority of Scottish voters had opposed leaving the EU in the Brexit referendum).
The Manchester Arena bombing and London Bridge attacks
In mid-April 2017 May called for a snap parliamentary election, saying that its results would provide stability and certainty for Britain during its Brexit negotiations and transition out of the EU. To hold an election ahead of the 2020 date mandated by the Fixed-term Parliaments Act 2011, May needed to win two-thirds majority approval in the House of Commons.
Corbyn welcomed a return to the polls, despite opinion polling that predicted big gains for the Conservatives, and, by a vote of 522 to 13 (with SNP members abstaining), the House of Commons approved a snap election for June 8.
The election campaign was temporarily suspended after 22 people were killed and dozens injured in a terrorist attack on the night of May 22 at a 21,000-capacity arena in Manchester following a concert by U.S. singer Ariana Grande.
The attacker who detonated the homemade bomb that wrought the destruction also was killed in the blast. ISIL claimed responsibility for the attack, in which many of those who perished or were injured were children—
teenaged and younger fans of the American pop star. It was the deadliest terrorist attack in Britain since the London bombings of 2005, in which more than 50 people were killed, and it followed an attack on Westminster Bridge in London on March 22 in which an attacker mowed down pedestrians with a car and then continued his assault on foot with a knife, taking five lives and injuring some 50 people before he was killed outside the Houses of Parliament by a security officer.
On June 3, five days before voters were to go to the polls, yet another terrorist attack unfolded in London. This time it occurred on London Bridge, where three attackers ran down victims with a vehicle before leaving it to menace others in nearby Borough Market with knives. Eight people were killed before police arrived, only eight minutes after the start of the incident, and shot and killed the attackers.
The snap election campaign
In addition to using the campaign to sell her version of “hard Brexit,” May sought to frame the election as a choice between her “strong and stable” leadership and that of Corbyn, who was characterized as an unreliable, out-of-touch leftist extremist.
However, Corbyn, once thought by many observers to be unelectable, proved to be an inspiring campaigner whose message of hope, compassion, and inclusiveness energized a new generation of Labour voters. May, on the other hand, often appeared uncomfortable, stiff, and uncertain on the campaign trail.
One element of her manifesto—a proposal to pay for in-home social care of the elderly with the proceeds from the sale of their homes after their deaths, a plan loudly condemned by many as a “dementia tax”—brought widespread outrage that prompted her to quickly alter the proposal.
Rather than appearing “strong and stable,” May, in the eyes of some observers, looked to be “weak and wobbly.”
The 2017 U.K. general election
When voters had their say on June 8, 2017, they handed the Conservatives a major setback. Rather than securing a mandate, May watched her party’s legislative majority disappear as it lost at least 12 seats in the House of Commons to fall to 318 seats while Labour gained at least 29 seats to surpass 260 seats in total.
Both parties garnered more than 40 percent of the popular vote in an election that witnessed a return to dominance by the two major parties. Led by Tim Farron, the Liberal Democrats, who had fared badly in the 2015 election, sought to reverse their fortunes by advocating another referendum on Brexit, and, while this proposal did not resonate with many voters, the party still gained four seats to reach a total of 12. Support for UKIP largely evaporated.
With the goal of Brexit all but realized, many of those who had supported UKIP in previous elections were expected to vote for the Conservatives, but, in the event, it appeared that many instead were swayed by Corbyn’s vision. The Conservatives did,
however, make big gains in Scotland, where the Scottish National Party fell from 56 seats to 35, in what was widely interpreted as a rebuke to Sturgeon and the SNP’s call for another referendum on Scottish independence.
Arguably the election’s biggest winner was Northern Ireland’s Democratic Unionist Party (DUP). Having increased its representation in the House of Commons from 8 to 10 seats, it found itself in the role of kingmaker when May enlisted its support to cling to power by forming a minority government (rather than seeking a formal coalition arrangement). With the support of the DUP on key votes, the Conservatives would be able to just barely surpass the 326-seat threshold for a legislative majority.
The central task for May’s government remained arriving at a cohesive approach for its Brexit negotiations with the EU. That task was a daunting one, however, because wide disagreement persisted even within the Conservative Party, not just on a myriad of details related to the British proposal for separation but also on the broader issues involved.
The Grenfell Tower fire, a novichok attack in Salisbury, and air strikes on Syria
In June 2017 Brexit was pushed off the front pages by one of the worst disasters in recent British history: a fire in a multistory public housing residence (Grenfell Tower) in London claimed the lives of 72 individuals, many of whom were recent immigrants.
The incident prompted a period of national soul-searching after it was revealed that months before the fire the building’s low-income residents had raised concerns about fire safety and complained that they were being treated like second-class citizens.
In March 2018 British national outrage was focused on Russia when a former Russian intelligence officer, who had acted as a double agent for Britain, and his daughter were found unconscious in Salisbury, England. It was determined that the pair had been victims of Novichok, a complex nerve agent that had been developed by the Soviets. Although the Russian government denied having any involvement with the attack and British investigators were unable to prove that the nerve agent originated in Russia, the May government responded by expelling some two dozen Russian intelligence operatives who had been working in Britain under diplomatic cover.
In April Britain joined France and the United States in launching air strikes against targets in Syria after it was revealed that the regime of Syrian Pres. Bashar al-Assad had again used chemical weapons on its own people.
Corbyn was critical of May for having ordered the strike without first consulting Parliament, but she countered that the action had to be undertaken without seeking parliamentary approval in order to protect the operation’s integrity. May also said that the strike was intended to prevent further suffering, and she characterized the decision as both right and legal.
The wedding of Prince Harry and Meghan Markle, the Chequers plan, and Boris Johnson’s resignation
In May 2018 Britain and much of the world stopped for a day to witness the royal wedding of Prince Harry to Meghan Markle—a divorced
American actress, daughter of an African American mother and a white father—whose informal approachability and personal warmth recalled the much beloved “People’s Princess” Diana.
The newlywed couple’s union reflected the changing social landscape of an increasingly multicultural Britain. Moreover, they seemed determined to modernize the monarchy and to connect it with the lives of everyday Britons.
In early July May summoned her cabinet to the prime minister’s country retreat, Chequers, determined to forge a consensus on the nuts and bolts of the government’s Brexit plan.
Despite forceful opposition by the cabinet’s “hard” Brexiters, by the end of the marathon meeting a consensus seemed to have emerged around May’s “softer” approach, grounded in policies aimed at preserving economic ties with the EU.
Just two days later, however, the government’s apparent harmony was disrupted by the resignation of Britain’s chief Brexit negotiator, David Davis, who complained that May’s plan gave up too much, too easily.
The next day Johnson left his post as foreign secretary, writing in his letter of resignation that the dream of Brexit was dying, “suffocated by needless self-doubt.” Suddenly confronted with the possibility of a vote of confidence on her party leadership, May reportedly cautioned Conservatives to line up behind her Brexit plan or run the risk of losing power to a Corbyn-led Labour government.
EU agreement and Parliamentary opposition to May’s Brexit plan
On November 25 the leaders of the EU’s 27 other member countries formally agreed to the terms of a withdrawal deal that May claimed “delivered for the British people” and set the United Kingdom “on course for a prosperous future.” Under the plan Britain was to pay some $50 billion to the EU to satisfy its long-term financial obligations.
Britain’s departure from the EU was to come in March 2019, but, according to the agreement, the U.K. would continue to abide by EU rules and regulations until at least December 2020 while negotiations continued on the details of the long-term relationship between the EU and the U.K.
The agreement, which was set to be debated and voted upon by the House of Commons in December, still faced strong opposition in Parliament, not only from Labour, the Liberal Democrats, the SNP, Plaid Cymru, and the DUP but also from dozens of Conservatives.
At the same time, the call for holding another referendum on Brexit was growing louder, though May remained adamant that the will of the British people had already been expressed.
A major sticking point for many of those who opposed the agreement was the so-called Northern Ireland backstop plan. Formulated to help maintain an open border between Northern Ireland and EU member Ireland after Brexit, the “backstop” stipulated that a legally binding customs arrangement between the EU and Northern Ireland would go into effect if the U.K. and the EU could not reach a long-term agreement by December 2020.
Opponents of the backstop argued that it set up the potential for regulatory barriers between Northern Ireland and the rest of the U.K., effectively establishing a customs border down the Irish Sea.
Objections to the Irish backstop and a challenge to May’s leadership
The issue grew more heated in the first week of December after the government was forced to publish in full Attorney General Geoffrey Cox’s legal advice for the government on the Brexit agreement, which had initially been reported to Parliament in overview only.
According to Cox, without agreement between Britain and the EU, the terms of the backstop plan could endure “indefinitely,” with the U.K. legally blocked from terminating the agreement without EU approval. This contentious issue was front and centre as the House of Commons began five days of debate leading up to a vote on the Brexit agreement that was scheduled for December 11.
Facing the likelihood of a humiliating rejection of the agreement by the House of Commons, May dramatically interrupted the debate after three days, on December 10, and postponed the vote, pledging to seek new assurances from the EU regarding the backstop. The opposition responded by threatening to hold a vote of confidence and to call for an early election.
A challenge to May’s leadership was quickly mounted within the Conservative Party, and, after more than the required 15 percent of the parliamentary party (48 of 317 MPs) requested a vote on her leadership of the party, a secret ballot vote was held on December 12, 2018.
May received the votes of 200 MPs, more than the 159 votes she needed to survive as leader. Although, according to Conservative Party rules, she could not be challenged as leader for another year, it remained to be seen whether May would still face pressure to relinquish power.
Parliamentary rejection of May’s plan, May’s survival of a confidence vote, and the Independent Group of breakaway MPs
Responding to May in a joint letter, European Council Pres. Donald Tusk and European Commission Pres. Jean-Claude Juncker indicated that, if the backstop had to be invoked, they would strive to limit its application to the “shortest possible period.”
However, this pledge satisfied few of the agreement’s critics. When debate on the agreement resumed on January 9, Corbyn argued not only for rejection of the agreement but also for an early general election.
On January 15 the agreement was overwhelmingly rejected by a vote of 432–202 (the worst defeat for a government initiative in modern British parliamentary history), and Corbyn tabled a motion of no confidence in the government, which May survived the next day, 325–306, having held onto the support of the DUP and many Conservatives who had deserted her in the agreement vote.
The longer the issue of Brexit remained unsettled, the more it became the fulcrum on which British politics turned. Political pundits began to note that opinions on May’s proposed version of Brexit and Brexit in general cut across ideological lines. Both Labour and the Conservative Party were riven by internecine conflict over Brexit.
In February eight MPs withdrew from the Labour Party, citing their disappointment in Corbyn’s leadership on the issue as well as concerns over alleged anti-Semitism within the party, a criticism that was at least partly tied to Corbyn’s sympathy for Palestinian concerns.
Only days after their departure, three moderate Tories left the Conservative Party, protesting that it had been hijacked by the European Research Group, a faction of right-wing hard-line Brexiters whom the departing MPs accused of acting as a party within the party. Joining together as the Independent Group, these breakaway MPs from both parties began taking steps toward formally constituting a new political party.
Meanwhile, in early March, Tom Watson, the deputy leader of the Labour Party, convened a meeting of Labour MPs and members of the House of Lords—many of whom felt that Corbyn had taken the party too far leftward—to consider an alternative vision for the party.
Parliament rejects May’s plan again
Against this backdrop, May continued negotiations with European leaders in an effort to win concessions that would garner wider support within Parliament than the terms of her earlier, shunned Brexit plan did. On the eve of a scheduled meaningful vote in the House of Commons on her revised plan, May secured new promises of cooperation on the backstop plan from EU leaders.
A “joint legally binding instrument” was agreed to under which Britain could initiate a “formal dispute” with the EU if the EU were to attempt to keep Britain bound to the backstop plan indefinitely. A “joint statement” was also issued that committed the U.K. and the EU to arriving at a replacement for the backstop plan by December 2020.
Finally, the U.K. put forth a “unilateral declaration” stressing that there was nothing to prevent Britain from abandoning the backstop if negotiations on an alternative arrangement with the EU were to collapse without the prospect of resolution.
In advance of the vote in Parliament, Attorney General Cox issued his opinion that while the new assurances reduced the risk of the U.K.’s being indefinitely confined by the backstop agreement, they did not fundamentally change the agreement’s legal status. In the vote on March 12, the House of Commons once again rejected May’s plan, though by a smaller margin than its earlier defeat, 391–242.
The next day the House of Commons voted 312–308 against leaving the EU without a deal in place. On March 14, by just two votes, May survived a vote that would have taken control of Brexit away from her and handed it to Parliament. In a letter to EU leaders on March 20, she requested that the date of Britain’s departure from the EU be delayed until June 30.
In response the EU announced its willingness to extend the Brexit deadline until May 22 but only if Parliament had accepted May’s withdrawal plan by the week of March 24.
“Indicative votes,” May’s pledge to resign, a third defeat for her plan, and a new deadline
Hundreds of thousands of demonstrators took to the streets of London on March 23 to demand that another referendum on Brexit be held. On March 25 the House of Commons voted 329–302 to usurp control of Parliament’s agenda from the government in order to hold “indicative votes” on alternative proposals to May’s plan.
Eight of those proposals were put to a vote on March 27, but none was able to gain the support of a majority, though a plan to seek to create a “permanent and comprehensive U.K.-wide customs union with the EU” came close, falling short by just six votes.
Also on March 27, May pledged to resign as party leader and prime minister if the House of Commons were to approve her plan, a gambit that won support from some “hard Brexit” opponents of the plan.
On March 29, owing to an arcane procedural rule invoked by Speaker of the House John Bercow, only the withdrawal agreement portion of May’s plan was voted upon by the House of Commons (excluded was the “political declaration” that addressed what the U.K. and EU expected of their long-term relationship).
Although the vote was closer than the previous two (286 in support, 344 in opposition), the plan once again went down in defeat. The U.K. now had until April 12 to decide whether it would leave the EU without an agreement on that day or request a longer delay that would require it to participate in elections for the European Parliament. May asked the EU to push back the deadline for Brexit until June 30, and on April 11 the European Council announced that it was granting the U.K. a “flexible extension” until October 31.
Shortly thereafter, in response to the Conservative Party’s seeming inability to position the country to leave the EU, Nigel Farage launched the Brexit Party. It proved to be a big winner in the elections for the European Parliament in May, capturing about 31 percent of the vote.
The next closest finisher was the Liberal Democrats, with about 20 percent of the vote, while Labour claimed some 14 percent and the Conservatives only about 9 percent.
Having failed to garner sufficient support from Conservatives for her exit plan, May entered discussions with Labour leaders on a possible compromise,
but these too proved fruitless. When May responded to that disappointment by proposing a new version of the plan that included a temporary customs relationship with the EU and a pledge to hold a parliamentary vote on whether to stage another referendum on Brexit, her cabinet revolted.
Isolated as never before, the prime minister announced on May 24 that she would step down as leader of the Conservative Party on June 7 but would remain as caretaker premier until her party had chosen her successor.
Boris Johnson’s ascension, the December 2019 snap election, and Brexit
After a series of votes by the parliamentary Conservative Party winnowed a list of 10 candidates to 2, Boris Johnson and Jeremy Hunt stood in an election in which all of the party’s roughly 160,000 members were eligible to vote. Johnson took some 66 percent of that vote to assume the leadership. He officially replaced May as prime minister on July 24.
Although he had promised to take the United Kingdom out of the EU without an exit agreement if the deal May had negotiated was not changed to his liking, Johnson faced widespread opposition (even within his own party) to his advocacy of no-deal Brexit.
Political maneuvering by the new prime minister (including proroguing Parliament just weeks before October 31, the revised departure deadline) was met with forceful legislative countermeasures by those opposed to leaving the EU without an agreement in place.
A vote of the House of Commons in early September forced Johnson to request a delay of the British withdrawal from the EU until January 31, 2020, even though on October 22 the House approved, in principle, the agreement that Johnson had negotiated, replacing the backstop with a plan to keep Northern Ireland aligned with the EU for at least four years from the end of the transition period.
Johnson repeatedly tried and failed to call a snap election that he hoped would secure a mandate for his vision of Brexit. Because the election would fall outside the five-year term stipulated by the Fixed-term Parliaments Act 2011, it required approval by two-thirds of the House of Commons to be held, meaning that it needed support from the opposition, which was denied.
After no-deal Brexit was blocked, however, Corbyn was willing to let voters once again decide the fate of Brexit, and an election was scheduled for December 12. Preelection opinion polling indicated a likely win for the Conservatives, but when the results were in, Johnson’s party had recorded its most decisive victory since 1987, adding 47 seats to secure a solid parliamentary majority of 365 seats.
The stage was set for the realization of Johnson’s version of Brexit, which was to take place at 11:00 pm London time on January 31, when the United Kingdom would formally withdraw from the European Union.
Society, state, and economy
State and society
Despite the so-called “dismantling of controls” after the end of World War I, government involvement in economic life was to continue, as were increased public expenditure, extensions of social welfare, and a higher degree of administrative rationalization. In the interwar years the level of integration of labour, capital, and the state was more considerable than is often thought.
Attempts to organize the market continued up to the beginning of World War II, evident, for example, in government’s financial support for regional development in the late 1930s.
Few Britons, however, felt they were living in a period of decreased government power. Nonetheless, attachment to the “impartial state” and to voluntarism was still considerable and exemplified by the popularity of the approved organizations set up to administer health insurance in the interwar years.
The direction of change continued to be the governance of society through what were now taken to be the social characteristics of that society itself (for example, family life as well as demographic and economic factors)—an approach developed by Liberal administrations before World War I—along with the advent of “planning,” but the connection back to Victorian notions of moral individualism and the purely regulative, liberal state was still strong.
Even the greatest exponent of the move toward economic intervention and social government, John Maynard Keynes, whose General Theory of Employment, Interest, and Money (1935–36) provided the major rationale for subsequent state intervention and whose work downgraded the importance of private rationality and private responsibility, nonetheless believed that governmental intervention in one area was necessary to buttress freedom and privacy elsewhere, so that the moral responsibility of the citizen would be forthcoming.
There was, however, only an incremental increase in the level of interest in state involvement in the economy and society in the immediate years before World War II, when the fear of war galvanized politicians and administrators.
It was the “total war” of 1939–45 that brought a degree of centralized control of the economy and society that was unparalleled before or indeed since. In some ways this was an expression of prewar developments, but the impetus of the war was enormous and felt in all political quarters. In 1941 it was a Conservative chancellor of the Exchequer, Sir Kingsley Wood, who introduced the first Keynesian budget.
Cross-party support was also evident in the response to the 1942 Beveridge Report, which became the blueprint of what was later to be called the welfare state. After 1945 a decisive shift had taken place toward the recognition of state intervention and planning as the norm, not the exception, and toward the idea that society could now be molded by political will.
Nonetheless, there was much popular dislike of “government controls,” and the familiar rhetoric of the impartial state remained strong, as reflected in Beveridge’s attack in 1948 on the Labour government’s failure to encourage voluntarism.
This voluntarism, however, was decidedly different from 19th-century voluntarism in that Beveridge advocated a minister-guardian of voluntary action. So pervasive was the postwar party consensus on the welfare state that the term coined to identify it,
“Butskellism,” is at least as well remembered as the successive chancellors of the Exchequer—R.A. Butler and Hugh Gaitskell—from whose amalgamated surnames it was derived.
From the 1960s onward this consensus began to unravel, with the perception of poor economic performance and calls for the modernization of British society and the British economy. The mixed economy came under pressure, as did the institutions of the welfare state, especially the National Health Service (NHS).
In the 1970s in particular, older beliefs in constitutional methods came into question—for instance, in the first national Civil Service strike ever, in 1973, and in the strikes and political violence that marked that decade as a whole. The result was a revolution in the relationship between state and society, whereby the market came to replace society as the model of state governance.
This did not, however, mean a return to 19th-century models, though the character of this manifestation of the relationship between state and society was clearly liberal, in line with the long British tradition of governance.
Institutionally, this way of governing was pluralistic, but its pluralism was decidedly statist. It was not, as in the 19th century, a private, self-governing voluntarist pluralism but one that was designedly competitive, enlisting quasi-governmental institutions as clients competing with one another in a marketplace.
In economic and cultural conditions increasingly shaped by globalization, the economy was exposed to the benign operations of the market not by leaving it alone but by actively intervening in it to create the conditions for entrepreneurship.
Analogously, social life was marketized too, thrown open to the idea that the capacity for self-realization could be obtained only through individual activity, not through society. Institutions like the NHS were reformed as a series of internal markets. These markets were to be governed by what has been called “the new public management.”
This involved a focus upon accountability, with explicit standards and measures of performance. The ethical change involved a transition from the idea of public service to one of private management of the self. Parallel to this “culture of accountability” was the emergence of an “audit society,”
in which formal and professionally sanctioned monitoring systems replaced the trust that earlier versions of the relationship between state and society had invested in professional specialists of all sorts (the professions themselves, such as university teaching, were opened up to this sort of audit, which was all the more onerous because, though directed from above, it was carried out by the professionals themselves, so preserving the fiction of professional freedom).
The social state gave way to a state that was regarded as “enabling,” permitting not only the citizen but also the firm, the locality, and so on to choose freely. This politics of choice was in fact shared by Thatcher’s Conservative administration and Blair’s Labour one. In both the state was seen as a partner.
In the so-called “Third Way” of Blair, one between socialism and the market, the partnership evolved much more in terms of community than in the Conservative case. In Blair’s Labour vision there was a more active concern with creating ethical citizens who would exchange obligations for rights in a new realization of marketized communities.
This new relation of state and society involved the decentralization of rule upon the citizen himself and herself, which was reflected in the host of self-help activities to be found in the Britain of the 1990s and 2000s, such as the new concern with alternative health therapies.
Reflecting this decentralization (in which the state itself made the citizen a consumer, for instance, of education and health) was the increasingly important role of the consumption of goods in constructing lifestyles through which individual choice could realize self-expression and self-fulfillment.
Economy and society
Economically, Britain had been hurt severely by World War I. The huge balances of credit in foreign currencies that had provided the capital for the City of London’s financial operations for a century were spent. Britain had moved from the position of a creditor to that of a debtor country.
Moreover, its industrial infrastructure, already out of date at the start of the war, had been allowed to depreciate and decay further. The industries of the Industrial Revolution, such as coal mining, textile production, and shipbuilding, upon which British prosperity had been built, were now either weakened or redundant.
The Japanese had usurped the textile export market. Coal was superseded by other forms of energy. Shipping lost during the war had to be almost fully replaced with more-modern and more-efficient vessels.
Finally, the Treaty of Versailles, particularly its harsh demands on Germany for financial reparations, ensured that foreign markets would remain depressed. Germany had been Britain’s largest foreign customer.
The export of German coal to France, as stipulated by the treaty, upset world coal markets for nearly a decade. Depression and unemployment, not prosperity and a better Britain, characterized the interwar years.
The British economy, as well as that of the rest of the world, was devastated by the Great Depression. The post-World War I world of reconstruction became a prewar world of deep depression, radicalism, racism, and violence.
Although MacDonald was well-meaning and highly intelligent, he was ill-equipped to deal with economics and the depression. By the end of 1930, unemployment was nearly double the figure of 1928 and would reach 25 percent of the workforce by the spring of 1931.
It was accompanied, after the closing of banks in Germany in May, by a devastating run on gold in British banks that threatened the stability of the pound.
MacDonald’s government fell in August over the protection of the pound; Britain needed to borrow gold, but foreign bankers would lend gold only on the condition that domestic expenditures would be cut, and this meant, among other things, reducing unemployment insurance payments.
However, a Labour Party whose central commitment was to the welfare of the working people could not mandate such a course of action even in an economic crisis. Thus, the Labour cabinet resigned. MacDonald with a few colleagues formed a coalition with the Conservative and Liberal opposition on August 24, 1931.
This new “national” government, which allowed Britain to go off the gold standard on September 21, was confirmed in office by a general election on October 27, in which 473 Conservatives were returned while the Labour Party in the House of Commons was nearly destroyed, capturing only 52 seats.
MacDonald, who was returned to the House of Commons along with 13 so-called National Labour colleagues, remained prime minister nonetheless. The new government was in fact a conservative government, and MacDonald, by consenting to remain prime minister, became and remains in Labour histories a traitor.
Under Neville Chamberlain, who became chancellor of the Exchequer in November 1931, the coalition government pursued a policy of strict economy. Housing subsidies were cut; Britain ended its three-quarter-century devotion to free trade and began import protection; and interest rates were lowered.
Manufacturing revived, stimulated particularly by a marked revival in the construction of private housing made possible by reduced interest rates and by a modest growth in exports as a result of the cheaper pound. Unemployment also declined, although it did not fall to the 10 percent level of the late 1920s until after the outbreak of war.
In terms of the occupational structure of Britain, the decline of the great 19th-century staple industries became increasingly sharp after World War I, and the interwar experience of textiles was particularly difficult.
The great expansion of mining after 1881 became a contraction, particularly from the 1930s, and domestic service, which itself may be termed a staple industry, suffered similarly. In 1911 these sectors accounted for some 20 percent of the British labour force, but by 1961 they accounted for barely 5 percent. Manufacturing continued to be of great importance into the third quarter of the century, when the next great restructuring occurred.
After World War I an increasing emphasis on monopoly, scale, and sophisticated labour management became apparent in British industry, though there was still much of the old “archaism” of the 19th century to be seen, both in management practices and in the entrenched power of certain skilled occupations.
This was buttressed by a considerable degree of continuity in terms of residential community. After 1960 or so, the wholesale development of slum clearance and relocation to new residential settings was to go far to dissolve this older sense of identity.
From the interwar years automobile manufacture, the manufacture of consumer durables, and light industry, especially along the corridor between London and Birmingham, as well as in the new industrial suburbs of London, announced the economic eclipse of the north by the south, the “south” here including South Wales and industrial Scotland.
In the Midlands electrical manufacturing and automobile industries developed. In the south, in addition to construction industries, new service industries such as hotels and the shops of London flourished. These in particular offered employment opportunities for women at a time when the demand for domestic servants was in decline.
London grew enormously, and the unemployment rate there was half that of the north of England and of Wales, Scotland, and Northern Ireland.
The effect of these developments was to divide Britain politically and economically into two areas, a division that, with the exception of an interval during World War II and its immediate aftermath, still exists.
New, science-based industries (e.g., the electrical and chemical industries) also developed from the interwar period, which together with the multiplication of service industries and the growth of the public sector—despite repeated government attempts to halt this growth—had by 1960 given rise to an occupational structure very different from that of the 19th century.
On the surface the 1950s and early ’60s were years of economic expansion and prosperity. The economic well-being of the average Briton rose dramatically and visibly.
But when prosperity created a demand for imports, large-scale buying abroad hurt the value of the pound. A declining pound meant higher interest rates as well as credit and import controls, which in turn caused inflation. Inflation hurt exports and caused strikes. These crises occurred in approximately three-year cycles.
The economic concern then of the British government in the 1950s and ’60s and indeed through the 1970s was to increase productivity and ensure labour peace so that Britain could again become an exporting country able to pay for public expenditure at home while maintaining the value of its currency and its place as a world banker.
A drastic run on the pound had been one of the pressing reasons for the quick withdrawal from Suez in 1956, and throughout the 1950s and ’60s Britain’s share of world trade fell with almost perfect consistency by about 1 percent per year. On the other hand, Britain benefited from an unprecedented rise in tourism occasioned mostly by the attraction of “Swinging London.”
All of this made Britain’s decision, after fierce political discussion, not to join the planned EEC, established by the Treaty of Rome on March 25, 1957, an event of signal importance. It meant that although economic conditions in Britain did indeed improve in the last years of the 1950s and through 1960—
Prime Minister Harold Macmillan could remark with only slight irony that the British people had never “had it so good”—Britain nevertheless did not share in the astonishing growth in European production and trade led by the “economic miracle” in West Germany.
By the mid-1960s there were signs that British prosperity was declining. Increases in productivity were disappearing, and labour unrest was marked. Prime Minister Macmillan quickly realized that it had been a mistake not to join the EEC, and in July 1961 he initiated negotiations to do so. By this time, however, the French government was headed by Charles de Gaulle, and he chose to veto Britain’s entry.
Thatcher’s most dramatic acts consisted of a continuing series of statutes to denationalize nearly every industry that Labour had brought under government control in the previous 40 years as well as some industries, such as telecommunications, that had been in state hands for a century or more.
But perhaps her most important achievement, helped by high unemployment in the old heavy industries, was in winning the contest for power with the trade unions. Instead of attempting to put all legislation in one massive bill, as Heath had done, Thatcher proceeded step by step, making secondary strikes and boycotts illegal, providing for fines, as well as allocation of union funds, for the violation of law, and taking measures for ending the closed shop.
Finally, in 1984–85, she won a struggle with the National Union of Mineworkers (NUM), which staged a nationwide strike to prevent the closure of 20 coal mines that the government claimed were unproductive. The walkout, which lasted nearly a year and was accompanied by continuing violence, soon became emblematic of the struggle for power between the Conservative government and the trade unions. After the defeat of the miners, that struggle was essentially over;
Thatcher’s victory was aided by divisions within the ranks of the miners themselves, exacerbated by the divisive leadership of the militant NUM leader Arthur Scargill, and by the Conservative government’s use of the police as a national constabulary, one not afraid to employ violence. The miners returned to work without a single concession.
In all these efforts, Thatcher was helped by a revival of world prosperity and lessening inflation, by the profits from industries sold to investors, and by the enormous sums realized from the sale abroad of North Sea oil. From 1974 the unexpected windfall of the discovery of large oil reserves under the North Sea,
together with the increase in oil prices that year, transformed Britain into a considerable player in the field of oil production (production soared from 87,000 tons in 1974 to 75,000,000 tons five years later).
The political use of oil revenues was seen by some as characteristic of the failure of successive British governments to put them to good economic and social use.
The restructuring of the economy away from the manual and industrial sectors, which was a consequence of the rapid decline of manufacturing industry in Britain in the 1990s, also meant the decline of the old, manual working class and the coming of what has been called “postindustrial” or “postmodern” society.
Within industry itself, “post-Fordist” (flexible, technologically innovative, and demand-driven) production and new forms of industrial management restructured the labour force in ways that broke up traditional hierarchies and outlooks. Not least among these changes has been the expansion of work, chiefly part-time, for women.
There has been a corresponding rise of new, nonmanual employment, primarily in the service sector. In the early phases of these changes, there was much underemployment and unemployment.
The result has been not only the numerical decline of the old working class but the diminishing significance of manual work itself, as well as the growing disappearance of work as a fairly stable, uniform, lifelong experience.
The shift in employment and investment from production to consumption industries has paralleled the rise of consumption itself as an arena in which people’s desires and hopes are centred and as the basis of their conceptions of themselves and the social order.
However, in the 1990s there was a considerable move back to the workplace as the source of identity and self-value. At the same time, new management practices and ideas developed that were in line with the still generally high level of working hours.
Central to the new economy and new ideas about work has been the staggering growth of information technology. This has been especially evident in the operations of financial markets, contributing hugely to their global integration. One of the great beneficiaries of these changes has been the City of London, which has profited from very light state regulation.
The financial sector, in terms of international markets and the domestic provision of financial goods and services, has become a major sector of the new economy. Speculation in markets, with ever-increasing degrees of ingenuity (for example, the phenomenon of hedge fund trading), has helped create a cohort of the newly rich in Britain and elsewhere.
It has also led to an increasingly unstable world financial system. The spoils of this new society have been divided between large-scale multinational corporations and new kinds of industrial organizations that are smaller and often more responsive to demand, evident in the development of the dot-com and e-commerce phenomena.
Internet shopping, along with the unparalleled development of giant supermarket chains, transformed the traditional pattern of retailing and shopping and, with it, patterns of social interaction. This, however, was only one aspect of a general transformation of the economy and society that even as recently as the early 1990s had hardly been glimpsed.
In the conditions of economic stability and prosperity at the turn of the 21st century, a relatively large middle group arose in terms of income, housing, and lifestyle that politicians and others began to refer to as “middle England.” In effect this meant Scotland and Wales as well, although
in Britain as a whole the old imbalance between west and east continued, in a similar fashion to that between north and south in England. However, even this middle was exposed to the vagaries of financial markets and an underperforming welfare state.
Moreover, the gap between the least well-off and the most well-off widened even further, so that alongside the new rich were the new poor, or underclass. Social mobility either declined or stalled in comparison with the 1960s; in particular, the capacity of the poorest parents to send their children to university diminished.
Levels of poverty among children continued to be high. The reborn postindustrial cities of the north and Midlands, such as Manchester, came to symbolize much of the new Britain, with their mixture of revitalized city centres and deprived city perimeters that were home to the new poor.
However, as had long been the case, the economic centre of the country remained in London and the southeast. Britain thus became a prosperous but increasingly unequal and divided society.
Family and gender
After World War I there was a further decline in the birth rate and a continuing spread of contraception, though contraceptive methods had been known and practiced by all sections of society for a considerable time before this. What was important in the interwar years was a development of contraceptive practices within marriage.
The gradual spread and acceptance of “family planning” was also important; however, this acceptance was not usually seen in terms of women’s rights. The birth rate continued to fall through the interwar years, and in the 1920s the two-child pattern of marriage was becoming established. With it came the “nuclear family” structure that was to be characteristic of much of the 20th century, with households predominantly made up of
two parents with children who, on achieving adulthood, would leave the home to establish similar families themselves. Nonetheless, as always, there was considerable variation in practice. Coresident kin and lodgers were still found, particularly in working-class households,
where overcrowding was often marked, as it was in London after the disruptions of World War II. There was also a concentration on childbirth within the early years of marriage, as well as longer life expectancy for children themselves.
Marriage was thus becoming a different kind of institution, at once more intimate and private, as well as an arena in which individual self-expression was becoming more possible than previously. In many respects, the privacy that was possible for the better-off in society in the mid- and late 19th century became increasingly possible for those less well-off in the course of the 20th century.
However, the privacy that new kinds of family life and new economic possibilities made possible for poorer people differed from middle-class privacy. It was concerned with securing order and control of people’s lives in economic conditions that were still often difficult. As a result, “working-class respectability” differed from the respectability evident further up the social scale.
For instance, privacy was evident in the slowly increasing possibility of separate rooms for separate functions (kitchens, sculleries, and bathrooms, for example) and the development of more-private sleeping arrangements. However, the respectability of this private life was also public in that it was on show to neighbours as a living proof of the family’s capacity to create order in difficult lives:
the elaborately presented front of the house and the purposefully opened curtains of the “best room” of the home displayed the carefully presented if precarious affluence of the family.
Nonetheless, despite material and cultural class differences, there was a convergence across the social spectrum upon an increasingly common privatized and nucleated family life. This was part of a much more homogeneous life course and set of life experiences, which made the population increasingly uniform, at least compared with that of the 19th century.
Age at marriage, the experience of marriage itself and of running one’s own household, household size, and the similarity of the age at which major life-cycle transitions occurred all tended to produce more cultural uniformity than previously; this increasing uniformity was of vast importance for the new consumer and media industries, not to mention the political parties.
The political culture was in fact transformed from one based on class to a new sort of populist, demotic politics, shaped at least as much by the mass media, especially the popular press, as by the politicians.
The greater individualism possible within this more-privatized form of marriage received expression in the growing incidence of divorce, even as marriage itself grew greatly in importance in the 20th century. By the 1970s almost every adult woman married at least once, though this figure fell considerably beginning in the 1980s.
By 1997 one-third of births occurred to parents not formally married; however, more than half of these were to parents residing at the same address. The phenomena of one-parent families, as well as of stable unmarried cohabitation, now became widely apparent. If people married more often, they divorced more frequently too, so that by the 1980s marriage disruption rates by divorce were equal to those caused by death in the 19th century.
By this time approximately one out of three marriages ended in divorce. These changes were of profound significance for politics in that they became linked in the public and political mind to the phenomena of antisocial behaviour by youth. Although this link was in reality complex, it did not stop the Blair administration from pursuing a “respect” agenda, which was designed to restore an at least partly imagined former era of civic virtue and public order.
The ill-fated ASBO (Anti-Social Behaviour Order), restricting the movement of offenders, was celebrated by some as an appropriately strong response to troublemaking neighbours and gangs but was condemned by others as an attack on civil liberties.
Of course, these social changes also greatly affected the understanding of women’s role in society. They were complemented by the growth of women’s employment, particularly in part-time jobs and most notably in the service sector, so that after 1945 a different life cycle for women evolved that included the return to work after childbirth.
These changes did not result in the equality of earnings, however; for example, despite the Sex Discrimination Act of 1975, under which the Equal Opportunities Commission was established, women’s pay rates in the 1980s were only about two-thirds of those of men.
Still, higher education was increasingly opened to women from the 1960s, so that by 1980 they formed 40 percent of admissions to universities, although, as with male students, they were overwhelmingly from the higher social classes.
As part of the widespread movement toward greater liberalization in the 1960s, in part inspired by developments in the United States, women’s liberation also developed in Britain
In turn, that movement gave rise to a whole range of feminisms, some more radical than others but all aiming at the ingrained assumptions of male superiority in employment practices, in education, and in the understanding of family life itself. Intellectual life became increasingly characterized by an explicitly feminist analysis, which led to some fundamental rethinking in a whole range of academic disciplines, though resistance to this was strong.
Changes in patterns of employment challenged stereotyped distinctions between the breadwinner and the housewife, as well as stereotypical notions of life as a married couple being based upon a well-understood division of labour within the household. The phenomenon of the “new man” developed, though his progeny of the 1990s, the “new lad,” was not quite what his father had expected.
Coined to describe what was in fact a reinvented, consumer-led version of a long-held and ingrained masculine worldview, “laddism” turned out to be a snazzier, more fashion-driven, and above all more unashamed version of the old devotion to “birds” (women), beer, and football.
In terms of popular leisure, music hall declined in popularity in the second quarter of the 20th century, but it left its mark on much of British culture, not least on the motion picture, which hastened its demise, and on television, which followed its end. By 1914 there were 4,000 cinemas in Britain and about 400,000,000 admissions per year.
By 1934 this had more than doubled, and admissions continued to rise steadily to reach a peak of 1.6 billion in 1946. This was a particularly popular form of entertainment, especially among the working class: the lower down the social scale one was, the more likely one was to visit the cinema.
The suburban middle-class motion picture audience of the 1930s was important but remained a minority. It is difficult to exaggerate the dominance of the cinema as a form of entertainment. In 1950, out of some 1.5 billion admissions to forms of taxable entertainment (a figure that included horse racing and football matches), cinema made up more than 80 percent.
Hollywood films dominated, though until World War II there was a thriving British film industry. This domination continued after the war, although British cinema asserted itself powerfully from time to time; for instance, in the Social Realism of the 1960s, notably in the work of director Lindsay Anderson, and later in the films of Ken Loach and Mike Leigh.
Parallel to these artful dissections of British life were the less high-minded but extremely successful “Carry On” comedies, which drew on the music hall tradition.
Reading matter continued to be produced within Britain, above all in the form of the newspaper. The British are inveterate newspaper readers, and there was mass consumption of a nationally based daily and Sunday newspaper press as early as the 1920s. This did much to create cultural uniformity, although, as with motion pictures, there were considerable differences of taste and preference regarding newspapers.
However, after 1950 the emphasis on uniformity became more marked and was reinforced by the progressive concentration of ownership in the hands of a few proprietors. This circle of ownership became even smaller as time went on, so that at the beginning of the 21st century the empire of the most powerful of these media moguls, Rupert Murdoch, not only dominated much of the popular press and made considerable inroads into the so-called quality press in Britain but was also international in scope.
Newspapers, however, were but one component of Murdoch’s and similar empires. The revolution wrought by new information technologies put control of a wide variety of communication forms, most importantly television, in the hands of these powerful individuals. Their political influence swelled as politicians of all persuasions were compelled to accommodate their power and, in a form of spin, play their version of the political game.
The development of a national mass culture seen in the previous period, in which the distinction between “popular” and “high” culture, if still important, was to some extent bridged, was to continue into the 20th and 21st centuries. (Cultural homogeneity was also intensified by increasing social and lifestyle uniformity.)
To a considerable extent, from the 1960s, all culture became popular culture, so that differences of gender, class, and ethnicity became if not merged then renegotiated in terms of a mass, “shared” culture. In this process, the older class differences were eroded, in line with other changes in class structure, particularly in the manual working class. At the same time, new differences and solidarities also emerged, particularly around age and levels of consumption.
Popular music—or pop music, as it came to be called from the 1960s—became an important area in which identities were formed. Pop has modulated through many forms since the 1960s, from the punk of the late ’70s and early ’80s to hip-hop and the rave culture of the ’90s, and distinct styles of life have accreted around these musical forms, not only for the youth.
The development of a uniform popular culture, at least as expressed through popular music, was greatly beholden to similar developments in the United States, where social identities were explored and developed in terms of black popular music, not just by African Americans but also by young white Americans.
Given the great importance of Afro-Caribbean immigration into Britain after 1945, and latterly south Asian immigration, the experience of ethnic minorities in Britain to some degree also paralleled that of the United States.
Concerns about national identity, as well as personal and group identity, became more important as Britain became a multicultural society and as the growth of European integration and economic globalization increasingly called British—and English, Welsh, and Scottish—identity into question.
The liberalization of the 1960s appears to have been crucial for many of these changes, with shifting gender roles being only one part of a broader international agenda. The civil rights movement in Northern Ireland, student protest, and the anti-Vietnam War and civil rights movements in the United States were all part of the assault on the still-strong vestiges of Victorianism in British society, as well as, more immediately, a reaction against the austerity of postwar Britain.
Change in family life and sexual mores was represented in the 1960s by a range of legislative developments: the Abortion Act of 1967; the Sexual Offences Act of 1967, partially decriminalizing homosexual activity; the 1969 Divorce Reform Act; and the abolition of theatre censorship in 1968.
Moreover, debate concerning sexual mores continued in Britain throughout the 20th century and into the 21st, not least regarding the ongoing attempts to change the legal age of consent and the controversial Section 28 Amendment to the Local Government Act in 1988, which prohibited local authorities from promoting homosexuality.
Legislation enacted by Parliament in 2004, however, made same-sex civil partnerships (civil unions) legal throughout the United Kingdom by the next year, and in July 2013 Parliament legalized marriage for same-sex couples in England and Wales.
While that law generally allowed religious groups to opt in to performing same-sex marriages, prohibitions against same-sex marriage in the Church of England and the Church in Wales remained in force.
Change was also based on the relative economic affluence of the late 1950s and ’60s. The disintegration of older values (including middle-class values) was evident in the “rediscovery” of the working class, in which films, novels, plays, and academic works (including the works of the Angry Young Men) depicted working-class life with unparalleled realism and unparalleled sympathy.
The working class was therefore brought into the cultural mainstream. This was ironic at a time when working-class communities were in fact being broken apart by slum clearance and the relocation of populations away from the geographical locations of their traditional culture.
Changes in higher education, with the development of the polytechnics and the “new universities,” meant that, at least to some extent, higher education was thrown open to children from poorer homes. There was also the liberalization of educational methods in primary and secondary education, along with the emergence of comprehensive schooling, ending the old distinction between the secondary modern and the grammar schools.
In practice, many of the old divisions continued and, indeed, increased. These persistent social divisions, however, were not accompanied by increasing cultural divisions; on the contrary, there was a much more positive understanding of the “popular” than before.
A more fluid, open, and commercial popular culture was signalled by the development in the 1950s of commercial television and, with it, the slow decline of the public-service broadcasting ethic of the BBC.
With the explosion of new channels of communication in the 2000s, particularly in television, there was a noted “dumbing down” of all media, which was especially evident in the celebrity culture of the new century and not unique to the United Kingdom. The new television gorged on this, as well as on reality programming and on the enormously increased popularity of professional football.
These brought all classes together in a new demotic culture, although at the same time differentiation according to income, taste, and education became increasingly possible because of the technologies of the new media.
The various lifestyles associated with different genres of popular music are one telling indication of the way that lifestyle can determine an individual’s identity in modern society. This development reflects the withdrawal of the state from the direct intervention in social life that was so characteristic of the third quarter of the 20th century.
The state’s turn to the market as a model of government has been reproduced in terms of the market’s direct role in the formation of cultural life, so that the relationship between public culture and consumer capitalism has been close, in many ways the one constantly trying to outguess the other.
This game of one-upmanship, marked by ironic knowingness, has been labelled “postmodern.” However, this term has come to describe much of late 20th- and early 21st-century international culture and society, not only in Britain. It points to the growing understanding of the relative nature of truth, itself a reaction against the prevailing supposedly “modern” certainties of the 20th century (reason, freedom, humanity, and truth itself), which indeed have often had an appalling outcome.
However, it was a sign of the times that these antifundamentalist currents, themselves critical of much of Western culture, emerged at much the same time as new fundamentalisms emerged in the forms of American neoconservatism and certain strains of radical Islam.
The ferment of intellectual and cultural change was inextricable from the massive changes under way in the transition to the novel forms of society made possible by new information technologies.
Sovereigns of Britain
1. Athelstan was king of Wessex and the first king of all England.
2. James VI of Scotland also became James I of England in 1603. Upon accession to the English throne, he styled himself "King of Great Britain" and was so proclaimed. Legally, however, he and his successors held separate English and Scottish kingships until the Act of Union of 1707, when the two kingdoms were united as the Kingdom of Great Britain.
3. The United Kingdom was formed on January 1, 1801, with the union of Great Britain and Ireland. After 1801 George III was styled "King of the United Kingdom of Great Britain and Ireland."
4. Oliver and Richard Cromwell served as lords protector of England, Scotland, and Ireland during the republican Commonwealth.
5. William and Mary, as husband and wife, reigned jointly until Mary's death in 1694. William then reigned alone until his own death in 1702.
6. George IV was regent from February 5, 1811.
7. In 1917, during World War I, George V changed the name of his house from Saxe-Coburg-Gotha to Windsor.
8. Edward VIII succeeded upon the death of his father, George V, on January 20, 1936, but abdicated on December 11, 1936, before coronation.