Wednesday 14 November 2012

The Marketisation of Higher Education: A lesson from the seminar room

Update: The Times Higher Education asked me to do a shorter version of this post for their comments section.  It is here: http://www.timeshighereducation.co.uk/comment/opinion/employability-agenda-isnt-working/2002639.article#.UUrF79iMf6g.twitter

The original post....

The other day, Howard Hotson, an Oxford history professor and steering committee chair of the Council for the Defence of British Universities, wrote in the Guardian about the dangers of marketisation in Higher Education. http://www.guardian.co.uk/commentisfree/2012/nov/11/universities-great-risk-we-must-defend-them?fb=native&CMP=FBCNETTXT9038

Professor Hotson's particular point here was about business management models being applied in academe, but as he and others, including the Council for the Defence of British Universities, have said, marketisation represents a more general threat to “fundamental academic principles and the purpose of higher education itself.”  Indeed I had an experience this very morning of how that threat works where it matters even more than in management—in the classroom.  My department does a first-year course called Making History that aims first and foremost to teach students about the complexities of researching and writing history. We also use it to teach transferable skills such as how to write and structure essays, how to present written work, how to reference properly, and so on, at this handily early stage in their student careers.  And as of this year we are also using it as a means to integrate “employability” into our curriculum. My colleagues and I co-teach much of this course in seminars, and today’s seminar was devoted to CVs.

On the face of it, there may seem nothing wrong with any of this. Teaching the problems of historical analysis is so obviously a part of our remit as history teachers as to require no further comment. Teaching how to write, how to communicate historical and indeed any kind of knowledge, is similarly unquestionably good. So is turning out people with degrees who are equipped for the job market. Indeed, through my 20 years as an academic I’ve always taken my responsibilities in this regard very seriously and worked hard to prepare students to be able to make their livings, as well as teaching them the critical thinking skills that are an essential part of undergraduate education. There is not, needn’t be, and shouldn’t be a zero-sum game or a polarised debate over whether the role of academic teachers is to train students in critical thinking or (properly, and) to train them in the knowledge and skills they need to find a place, and be useful, in the world of work after they graduate. We can and should do both. So what am I worried about?

I’m worried because even though there need be no zero-sum game between the aforementioned ends of a university education, and indeed one might argue that the best employees as well as the best citizens are ones with free and critical minds, problems nevertheless arise when the proponents of one agenda attempt to diminish the other. And this morning’s class was for me in microcosmic form an example of how the “employability” agenda is diminishing the imperative of teaching critical thinking. Here’s how. 

“Employability” is now indeed explicitly an agenda, and it is an agenda driven in very particular ways by very specific interest groups. Peter Mandelson, the high priest of New Labour neo-liberalism, moved responsibility for Universities from the Department of Education to the Department for Business, Innovation and Skills, where it remains under the charge of Vince Cable, a man who once preached against economic neo-liberalism but whose words turned out to be about as reliable as a Liberal Democrat manifesto pledge. Our masters at the Department for Business, Innovation and Skills have since made it no secret that they want HE to focus more on employability, on preparing students for the world of work. This kind of thing used to be the task of the Careers Office in combination with academics in their capacities as Personal Tutors. There it remains, though that aspect of Personal Tutoring is now called Personal Development; a clear linguistic signal about who is setting the agenda and what kind of agenda they’re setting. But the newest development in this clearly creeping process is that in parts of the students’ timetables where we used to teach them history (and you can insert any other discipline here to the very same effect), we are now expected to teach things like CVs.  Employability at my university has now therefore crossed the boundary from extra-curricular to curricular activity. It may have done so already at other institutions; it will no doubt do so later at yet more. To ensure that we do this, by the way, “employability,” measured in terms of what proportion of a university’s students are in full-time employment six months after graduation, features highly in university league tables—another weapon taken from the bag of sticks used to beat universities into submission to the ethics and practices of the market. University managers feel they have no choice but to do the bidding of business and government or else go “out of business.”

Is this really that bad? you may be asking. It’s just one seminar out of ten; surely not too much time to take from the academic syllabus for the sake of employability. My first answer is that, okay, it’s one of ten seminars—for now. Far more revealing, however, is what I was required to do in this one class. As noted above, the employability agenda in general is being driven by business interests operating through the government, which pressures university managers. The origins and intentions of the agenda, though, very much show through in the way “employability” is required to be presented in the classroom. For each seminar each tutor on the Making History (now Making History History?) course is presented with a “task” for him or her and the students to do that week. Here, in full, is the one I was given for this morning’s seminar (my co-teaching colleagues got the same one).

“Seminar 7

For this task you need to prepare three things:

  • a CV
  • a paragraph identifying its weaknesses
  • an action plan for how you are going to address these weaknesses
Guidance on all this can be found in the employability section of the Information for History Students site on Blackboard.”

As this indicates, the task is not really just about the perfectly laudable aim of helping students prepare to find jobs.  It’s about much, much more than that.  It's about turning them into cogs in the corporate machine. Let’s go through each of the things the students had to prepare, one by one, to see more fully what they’re about.

First, the students were required to present their CVs in front of each other, inviting them to think of each other as competitors. Of course one day they will be competitors for individual jobs in the market place. But it is not necessary to make students think of themselves as employment rivals in the first semester of their first year in university, except perhaps if the aim is to inure them to the notion that they are competitors and to normalise that perception of themselves and of the world, to make the rat race seem the right and only way for the human race. 

Next, as above, the students were asked to identify their weaknesses as potential employees. Again, they’re first years in a history class. Why do that this early? Here's why.  Another piece of modern corporate cant is that workers need to be flexible and adaptable to the needs of business--as opposed to businesses being flexible towards the needs of individuals, communities, countries (paying their taxes?), the environment, etc.  Once again, then, it’s about inuring students early to the first imperative, making them think that they have weaknesses, they are the problem, they need to adapt, the underlying assumption of all of which is that they serve the world of business more than the world of business serves them and the rest of us.

The third thing they had to prepare would have the effect of practical implementation of that notion: that they need to serve business and business does not need to adapt or serve them. Not just ideological co-option then, but the beginnings of behavioural co-option, not just indoctrination, but actual preparation for collaboration.  It had the added intent perhaps of inoculating them to the language of the modern boardroom, of “action plans” in this instance. How's that for "thinking outside the box"?

The effect of the whole, especially when we as a History and Classics Department give it our sanction by including it in our syllabi and on our information sites (which, as above, we emphatically do), is to promote the notion that all this is all right, a natural order of things, or at least that it's inevitable and something that students must accept. Not something they should question with critical, thinking minds. Certainly not something they should complain or protest about.
 
As it was, I chose as far as I was able not to let the above happen to its full potential effect. Instead, first, I felt it was unconscionable to require students to present their CVs in front of other students. True, CVs are essentially public documents that are handed over eventually to individuals or committees hiring workers. But I felt it was wrong to make them hand these documents over for perusal in a classroom.  And not just because it would encourage students to think of each other as economic competitors at a time in their lives when they should be thinking of each other as members of an academic learning community.  But also because some might feel uncomfortable or even perhaps humiliated by the process of comparing their records with those of others in this way, and wholly unnecessarily so at this early stage of what we still sometimes call their academic careers—especially when they’re asked to identify and establish “action plans” to deal with their “weaknesses.” They should not feel or be made to feel either uncomfortable or humiliated, and they wouldn’t be if we gave them the message that right now they’re students and shouldn’t feel obligated to be job-market-ready just yet, and that it’s okay at this stage in their lives if they have other interests and priorities. But we’re not giving them that message. By allowing the “employability” agenda directly into the classroom in the form of presenting CVs, making them dwell on and fix their "weaknesses," their unfitness for purpose, we are telling them instead that their worthiness as students depends on their job-readiness and on their acceptance of what they are told that requires. So, rather than force that on them, I gave my students today a choice of showing their CVs, identifying their weaknesses, and making their action plans--or not doing so.
I felt I had to let some do it if they wanted to in order to maintain equal opportunity with those students in other classes who were doing it.  But I gave them the choice not to do it.

I also decided I wouldn’t let the classroom time go by without asking them to discuss the above issues, allowing them the opportunity to do some critical thinking, to reflect on their learning and take a measure of ownership of it—concepts that are not as in vogue as they were before business interests began determining educational priorities more directly.  In that spirit I assured them, as I always do, that they must make their own minds up and argue for themselves—not follow my lead.  I should no more try to indoctrinate them than business interests and governments should. The subsequent discussion revealed a wide range of opinion (though no reductive polarisation)—a very healthy thing among intelligent, independent-minded young people, who, I was thereby given encouragement to believe, will not be easily brainwashed by anyone.

A final point. Another aspect of market-based thinking that’s always thrown at us in support of the marketisation of Higher Education is the cant of consumerism.  And the supposition is that, as "consumers of the higher education experience" (Peter Mandelson), students' top priority is job training.  This is a false proposition, first because it presupposes that students are first and foremost consumers, although that's a compelling proposition when you’ve laid the groundwork for it by charging students up to £9,000 a year for a university education.  It’s also a weird one, though, when we’re so often presented with the notion that there’s no alternative to the marketisation of Higher Education and indeed of just about everything else—that is, there is no choice but "choice". And it's demonstrably false for another reason. It was interesting that this morning five out of ten students opted not to engage in the CV task, and ten out of ten thought that employability education and advice should be left to trained experts in dedicated careers offices and should have no place in a history course. Another five who are registered in the group chose not to turn up for the CV task at all. I’m required to report them for their absences. But I’m choosing not to.



Monday 5 November 2012

Remember, remember the fifth of November; or, understanding some aspects of the historical origins of the US Presidency; or, why Americans are roundheads

Newly reposted and slightly revised for 5 November 2014, as many US voters once again proved how very different they are.

This blog post was originally intended as a serious-ish discussion of aspects of British and American constitutionalism that help explain some aspects of the current US Presidential election, aspects deriving from the Glorious Revolution of 1688. The tenuous-ish link to November 5 has to do with William of Orange landing at Torbay on that day, 324 years ago. As is my way, though, I got diverted for several hundred words into a catastrophically meandering preamble replete with the usual desperate, attention-seeking lengthy humorous asides, or non-humorous asides, depending on such things as your point of view, and age. So, if you want to skip all that and get to the serious bit, scroll down until you see the words, in bold, This is the serious bit.
As children, my fellow undergrown gumps and I were often told by our teachers and our televisions to “remember, remember the fifth of November.”  Sometimes these authorities invoked this phrase as part of health and safety lectures about not holding lit fireworks and thereby ending up with a blackened lump of burning bone and smouldering skin where your hand used to be, although even back then I thought that wouldn’t be such a bad thing in some cases, for the greater good, you know, the Darwinian betterment of the human race and all that.  This may be harsh, but it is an important consideration.  Imagine if lizards made an evolutionary comeback. The two-legged stand-up ones (who were always the dangerous ones, compared to those fat, four-legged, leaf-eating dumbosaurs) would have major evolutionary advantages over us, accustomed as they would already be to performing daily tasks with naturally short arms and stumpy paws. That’s where namby-pamby health and safety will get us once global warming returns us to the most elemental struggle for survival—defeated and eaten by dinosaurs.
Anyway, I digress.  We were also taught that the fifth of November is about Guy Fawkes’s attempt to blow up the Houses of Parliament in 1605. “Remember, remember the fifth of November” comes from the following, or some variant of it:      
Remember, remember!
The fifth of November,
The Gunpowder treason and plot;
I know of no reason
Why the Gunpowder treason
Should ever be forgot!
Guy Fawkes and his companions
Did the scheme contrive,
To blow the King and Parliament
All up alive.
Threescore barrels, laid below,
To prove old England's overthrow.
But, by God's providence, him they catch....
Fair enough, an important historical moment to be sure, and it should be remembered.  But it shouldn’t and doesn’t have to be and often isn’t remembered to the exclusion of other fifths of November.  We’ve already seen that our future Giant Lizard Kings will remember the fifth of November as a crucial factor in their eventual ascendancy over us.  But there are other fifths of November to remember too. One that used to be universally remembered in the Anglo-dominated world is the fifth of November 1688, the day that William of Orange landed with his forces at Torbay, in Devon, on his way to overthrow James II and claim the throne for himself as King William III and his wife as Queen Mary II.
Now this event, the Glorious Revolution as it came to be called, can, of course, be remembered in different ways for different things. The English might remember it for its restoration of Parliamentary and other liberties from the prerogative-grabbing claws of William’s father-in-law James, the father of Mary, a legacy perhaps best encapsulated in the Declaration and then the Bill of Rights of 1689.  It often isn’t remembered this way, however, or indeed at all, as I pointed out somewhat shirtily in a previous post about a radio programme positing that Britain should perhaps adopt a Bill of Rights, perhaps based on the US one (we already have one, from 1689, and the US one is partly based on it):  http://stevesarson.blogspot.co.uk/2011/10/britain-already-has-bill-of-rights-yes.html
Alternatively, and equally validly, Catholics might remember the fifth of November 1688 and what followed it as the beginning of a century and a half of political and economic disfranchisement and social segregation. The Scottish might and perhaps more often do remember it as a moment in the aggrandisement of Parliamentary authority that led to the kind of imperialism that saw their parliament abolished in 1707, not to be restored until almost three centuries later. The Irish might and certainly far more often do remember it for James's final defeat by William at the 1690 Battle of the Boyne, and what followed—another moment in the on-going oppression of the Irish and of Catholics. Patriotic English people used to remember it this way too, only in a rather more celebratory way, and indeed the fifth of November often used to be called Pope’s Day, and people would burn effigies of the pontiff rather than of Guy Fawkes. The Protestant Irish still see it in something like these terms, and to this day they march about in traditional seventeenth-century costumes, such as bowler hats of the exact same kind that King William and Queen Mary used to wear.
Whatever the case, for all the Glorious Revolution's ambiguities as an event, I think it’s a shame that it isn’t as well remembered as a moment in the history of liberty as well as other things as it might be. It certainly used to be. When the American lawyer and legislator John Dickinson wrote his famous (and deviously misnamed) Letters from a Farmer in Pennsylvania, objecting to British taxes in the form of the 1767 Townshend Duties, and to Parliamentary presumption of an entitlement to rule over the American colonies, as expressed in those taxes and in the Declaratory Act of the previous year that proclaimed Parliament’s right to legislate for the colonies “in all cases whatsoever,” he published the first of the letters on the fifth of November. And everyone knew what that meant—Dickinson was performing a symbolic insurgency in favour of liberty analogous to the invasion of England by William of Orange, the cheeky monkey. 
This is the serious bit.
[Fellow historians and, for that matter, political scientists: please see * below for a quite possibly unnecessary and vainglorious note on intellectual property.]
So, anyway, why, given all the different ways to remember the fifth of November, am I writing about it here today? It’s because, as it happens, this year, the fifth of November falls on the day before the day of the US Presidential election. What possible bearing can the fifth of November 1688 have on the 2012 contest for the occupancy of the most powerful address on earth, 1600 Pennsylvania Avenue, the White House? Well, quite a lot, actually, and here’s why.
As many historians of colonial America have said, in some ways the Glorious Revolution in the colonies was similar to the one in England. Yet in some crucial respects, its settlement worked out very differently on either side of the Atlantic Ocean. In the run-up to 1688, colonists felt that their rights as “freeborn Englishmen” were being violated by James II or his minions much as Englishmen back home felt theirs were. The need for actual revolutions in most colonies, though, was obviated when colonial governors declared allegiance to Parliament and to William and Mary once news reached America of James’s overthrow. Two of the three exceptions prove the rule. In the colonies of Massachusetts and New York, local revolutions overthrew Governor Edmund Andros in Boston and Lieutenant-Governor Francis Nicholson in New York City, who were agents of James II’s direct rule of the Dominion of New England. When that regime collapsed in those places, it also collapsed in the other Dominion colonies of Connecticut, New Hampshire, Plymouth, Rhode Island, and New Jersey. Things were slightly more complicated in Maryland, a proprietary colony owned by the Lords Baltimore, the Calverts, who were Catholics, bringing a religious dimension to their overthrow in early 1689 by a Protestant Association, making that local revolution even more resonant of events back in England. (The Calverts got their proprietary back when Benedict Leonard Calvert, the fourth Baron Baltimore, converted to Anglicanism in 1715.)
As the above indicates, there were local contingencies that made the Glorious Revolution, or revolutions, in America sometimes somewhat different from what transpired in England, but it’s how the Glorious Revolution’s settlement panned out over the long term that really explains some of the differences in British and American politics since, including the nature of the executive arms of government and the relationships of executives to legislatures on this side of the Atlantic and that.  
In England, desperate as politicians were to be rid of the Papist tyrant James, as they saw him, and to restore what they saw as England’s ancient constitution, they were also desperate to avoid falling back into the dark days of the 1640s, back into civil war in which one-in-ten of the population died, and into God-only-knew what kind of cosmic retribution another regicide might invite.  Men as diverse as radical Whig believers in popular sovereignty, moderate Whig and Tory believers in Parliamentary sovereignty, and High Tory believers in the Divine Right of Kings therefore worked together on a compromise that all could peaceably live with. They thus agreed that James had “abdicated” rather than been forcibly overthrown. That allowed them to believe that the Declaration and Bill of Rights represented the work of the people, or of Parliament, or of God, as ideological preference directed. Eventually, they could all believe in the “principle of co-ordination” whereby sovereignty, whether it originated in the people, Parliament, or the crown, in practical terms worked through the operations of the crown-in-parliament. By the 1720s, the time of the premiership of Robert Walpole, the British parliamentary system worked much as it does today. The leader of the majority party in the House of Commons becomes the King’s or Queen’s First or Prime Minister, the head of an executive that in the nature of the system controls the majority in the legislature. (Or, in the event of a hung Parliament, the leader of the party with the largest minority can form a coalition with the leader and members of a third party to form a working majority in the House of Commons, especially if members of the third party are a sufficiently unprincipled bunch of shit-weasels who, for the sake of experiencing chimerical power, are willing to betray every principle they promised during the foregoing election to uphold.)
In America, however, or, rather, in the various Americas, it didn’t turn out this way at all. Royal governors were just that: royal governors who derived their authority from the imperial centre, whether formally from the Privy Council and thus the crown, or in practice from government ministries, or indeed from the complicated but co-ordinated operations of the crown-in-parliament. Governors could not co-ordinate with provincial legislatures in the way the crown could and did with Parliament in London, as that would have put them at odds with their metropolitan masters. Even when they did what local legislative assemblies told them to do, as indeed many had little choice but to do, they could not institutionalise that kind of executive-legislative co-ordination, and ended up politically ineffective from an imperial-interest point of view. When they were active, the implementation of imperial-executive will necessitated such actions as proroguing and even dissolving colonial legislative assemblies, vetoing their legislation, and exercising other traditional executive actions such as creating courts and dismissing politically disagreeable judges—the kinds of prerogative powers that were either explicitly outlawed by the Bill of Rights or else obviated by the operations of co-ordination within England (or, after the 1707 Act of Union, Britain). It was this kind of executive prerogative that Parliament adopted towards the colonies from the passing of the Sugar Act in 1764, ultimately leading to American independence in 1776, which the British were forced to recognise in the Treaty of Paris of 1783. 
The Americans thus never developed the tradition of co-ordination that defines the British system of parliamentary democracy. They thus to this day retain a degree of separated rather than mixed powers in their constitution. Thus it is that the American President is elected every four years singly and separately from members of Congress, who are elected every two years in the case of Congressmen and Congresswomen (members of the House of Representatives) and every six years in the case of Senators. Thus it is that there can be and often is a President of one political party and either a House of Representatives or a Senate or indeed a whole Congress dominated by another. Far from controlling the legislature, as is inherently the case in the British system, the American executive often finds himself in opposition to it.
It’s often said that the US constitution is new, or at least much newer than the ancient constitution of Great Britain. I’d contend that at least in some respects the opposite is the case. The modern British constitution is very much the product of 1688-89, a 1688-89 that never happened in America. The US Constitution reflects (and so do all the state constitutions with their independent governors) an older oppositionism between executive and legislature that pre-dates the principle of co-ordination that emerged from the Glorious Revolution in England, and that indeed was such a divisive feature of the English civil war era and in fact much further back into the English past. And what’s true of American political constitutionalism may also consequently be true of American political culture. If the moderation of modern British politics is a product of the co-ordinated nature of the British parliamentary system (as opposed to being somehow inherent in British character, as some ridiculous people would have you believe), then perhaps the sometimes “paranoid style” of American politics (as it has been termed by American political historians) is a product of an inherited roundhead tradition of legislative opposition to the ever-present danger of tyranny that they believe executive power inherently represents. 
So, what I’m saying is, if you think American politics is a bit weird and wacky, remember, remember the fifth of November (1688).     

[* Academic friends: I'm intending to develop and publish these ideas one day, so if for any reason you are enough of a poppet to consider them worthy of mention, then I'd appreciate a citation. Ta.]
                                              

Wednesday 31 October 2012

Disney & the Future of Star Wars; or, how about a Sci-Fi Mash-Up with Star Trek, Dr. Who, & The Hitchhiker’s Guide to the Galaxy?

Update: 4 May 2020. None of the stories below was commissioned by Disney, and none will be appearing at a cinema near you any time soon. Nevertheless, May the Fourth be with you.
   

The recently announced [October 2012] Disney takeover of George Lucas’s Star Wars franchise is causing a great deal of grief and gnashing of teeth.  Aficionados of the Sci-Fi classic seem concerned that the promised new films will somehow be Disnified, perhaps with a regiment of Stormtroopers unrealistically defeated by Seven Dwarves, or the Millennium Falcon risibly outmanoeuvred by a flying elephant, or some other harbinger of the End of Days. But Star Wars lovers needn’t worry. There is surely enough cultural capital in the Vaderverse to sustain at least the promised Star Wars 7, 8, and 9 without having to adulterate these artworks with childish foolishness. And if the new makers are running out of ideas when they decide to story-board Star Wars 10 plus, there’s still no need for them to stoop to having a Wookie in a light-sabre-death-fight with Donald Duck, or, worse, Princess Leia being enslaved by a mouse. Instead, and much more appropriately, they can find interesting inter-textual synergies with other Sci-Fi franchises, such as Star Trek, Dr. Who, and The Hitchhiker’s Guide to the Galaxy.  Indeed, below are some possible plots for Star Wars 10, 11, and 12 based on exactly these potential scenarios.  And some guesses about how, in artistic and entrepreneurial terms, it might all pan out.     

Star Wars 10: “Space Seed”

The Kardashians and their evil leader, Kim, are planning to conquer the entire Galactic entertainment industry and, consequently, the political structures of the whole Milky Way. Luke and the gang are determined to stop them, but don’t have the military or, more importantly, the media resources to do it alone. They therefore team up with internationally well-connected Captain Jean-Luc Picard of the second iteration of the Starship Enterprise in order to defeat the dastardly scheme. In a dramatic twist, Captain Picard, immediately after ordering the final attack on the Kardashians’ Hollywood lair by bellowing “MAKE IT SHOOOOW” in an unnecessarily loud voice suited more to the requirements of live theatre than to film-making and television, is mauled to death by a Wookie who has become deranged with anger at the injustice of such an appalling ham taking a lead role while he, a superior actor, is invisible to the audience, hidden as he is in what he later describes to police psychologists as “a fucking wanking Bigfoot suit.”      

In a subplot, Princess Leia, inevitably sidelined when the real space-ship and laser-gun action narrative begins, goes on a week-long Pan-Galactic-Gargle-Blaster bender on Betelgeuse 5, until rescued by a semi-retired Captain James T. Kirk, who restores her to mental and physical health with his “Space Seed,” after seducing her with a cornographically humorous invitation back to his Space Hotel room to see his “Force.”



                    The Kardashians’ evil leader, Kim (right) and sibling henchperson Kourtney.


Star Wars 11: The League of Cybermen Strikes Back

The League of Cybermen, having failed to take over the Earth and the Milky Way, are planning to conquer a galaxy far, far away. Luke and the gang are determined to stop them, but don’t have the military or, more importantly, the time-travel facilities to do it alone. They therefore team up with Doctor Who in order to defeat the dastardly scheme. In a comedic twist, Han Solo cannot get the hang of flying the Tardis, and there are hilarious scenes of him stalling it, going round in circles, and landing it on its roof, until, in exasperation, he hands over the controls to a gigantic space bear. Solo blames the machinery, calling it a “goddamned un-aerodynamic bean tin.” The Doctor explains that because space is a vacuum there is in fact no need for aerodynamic shaping of space ships, and nor indeed for time-travel vehicles. This information offends Solo’s belief in a universe dominated by symbolically masculine geometries, so he punches The Doctor in the face.

In a subplot, C3PO becomes surprisingly aroused by the rearward attentions of K9, but the tinny cyber-romance scenes are panned by critics because of the young director’s ambitious but lamentably ill-judged attempts at an arty allegorical referencing of Blade Runner, including a confusing, unfeasible, and frankly tragic dual role-play by Harrison Ford.


Star Wars 12: So Long, and Thanks for All the Pish

The Chinese government and the warrior people of Krikkit are planning to conquer the galactic headquarters of the Brockian Ultra-Cricket Association in order to sell the broadcast rights to Sky TV, depriving terrestrial television viewers of free-to-air sports coverage. Luke and the gang are determined to stop them, but don’t have the military or, more importantly, the political connections to do it alone. They therefore team up with Arthur Dent, Ford Prefect, Trillian, and Galactic President Zaphod Beeblebrox in order to defeat the dastardly scheme. In a casting twist, after lobbying about negative cinematic representations of Asian Americans, the role of the insanely sadistic chief of the Chinese Secret Service is offered to Alan Rickman. In the film, Rickman holds the Dalai Lama and Yoda hostage, hoping to use their wisdom in dealing with the cunning and double-crossing Rupert Murdoch, who is also played by Alan Rickman. The day is ultimately saved by Darth Vader (who, following the death of Sir Alec Guinness, is played by Alan Rickman), when he turns away from The Dark Side after being told by his brutal bodyguard and sidekick, played by Sean Bean, that one simply does not associate with a man as evil as Rupert Murdoch. 

In a subplot, after many, many lifetimes of sexual abstinence, the Dalai Lama and Yoda find themselves strangely attracted to each other, and an ultimately unresolved homoerotic will-they-won’t-they scenario ensues. 

Sadly, Star Wars 12 bypasses cinemas and DVD release, going straight to Channel 5 and Fox TV, where its only viewers mistake it for a documentary. Disney decides to discontinue the franchise for the good of all humankind, and the producers and directors spend the rest of their lives in the Hollywood Hills giving poolside interviews to large young men with over-sized beards and small young men with over-sized spectacles for broadcast on their niche-audience YouTube channels.

          

Wednesday 10 October 2012

The Rhetoric of David Cameron’s Speech to the Conservative Party today at the Birmingham Party Conference; or, the Conservative PC Lies in Scameron’s Ansprache to the Lizard People today at the Nuremberg Rally.

The Tories, with their US Republican ideological soulmates, long ago came up with the term “Political Correctness” to describe liberal-left attempts to find respectful language to refer to non-white people, LGBT people, disabled people, and so on.  Of course, the term itself is a kind of political correctness that allows them to demean these efforts to respect others, to dismiss others' rights to self-representation, and to construct such respect for and rights of others as oppression of their own freedom of speech. Or, put another way, their freedom to continue routinely (and without comic effect or political inversion) using words like darkies, benders, mongoes, spazzers, and so on.  There is indeed a whole lexicon of Conservative Political Correctness at work today.  I’ve written before (as have others) about how you can’t say “cuts” anymore; nowadays you have to say “efficiencies,” or at least I wrote about it when I saw Tory Not-Even-Slightly-Secret-Agent Nick Robinson using that whitewashing and factually misleading word: http://stevesarson.blogspot.co.uk/2012/09/a-long-tweet-to-nick-robinson.html.  Similarly, in ConPC-speak, the term “something for nothing culture” now applies to people who actually have nothing, rather than people who inherited large fortunes that their parents acquired by maximising them in tax-havens (or low-tax havens and no-tax havens as they might much more accurately be called).  And, of course, the word “not,” in the ConPC dicktionary*, means “and break up, privatise, and thereby abolish”: as in the election pledge that “We’ll cut the deficit, not and break up, privatise, and thereby abolish the NHS.” (*Not a typo.) 

I’m writing this as David Cameron (David Scameron) is delivering his keynote speech (Ansprache) to the party faithful (fellow Lizard-People) at today's Conservative Party Conference (Nuremberg Rally).  There will be lots of ConPC (lies) in the Ansprache, and in light (the obfuscatory darkness) of the litany of enormously damaging dishonesty I have decided to publish this brand new blogpost about ConPC rhetoric as it might be applied in a Tory version of a history textbook (reheated version of this:
http://stevesarson.blogspot.co.uk/2012/09/a-glossary-of-terms-for-tory-history.html with a new context and introduction and one extra joke: at the end below, if you've read this lazy rehash already and want to skip to the only new bit).  

Medieval Society. Early Big Society.

Feudalism. Social responsibility.

Serfdom. Internship.

The Peasants’ Revolt. Class hatred (orig. David Starkey).

Enclosure. Land management efficiencies (necessary and unavoidable, there is no alternative, and it's not at all driven by class-interest or ideology).

The English/British/Civil War/s/Wars of the Three Kingdoms (etc.) / The Interregnum / The Glorious Revolution. The Unfortunate Disruptions.

The Renaissance/Enlightenment/Scientific Revolution. The Birth of Capitalism (orig. Niall Ferguson).

The Birth of Capitalism. The Enlightenment / The Great Going Forward (orig. Niall Ferguson).  

Imperialism/colonialism. Global democratisation (orig. Niall Ferguson).

Empire. Free-trade zone (orig. Niall Ferguson).

The Atlantic Slave Trade. African Labour Recruitment System THAT WAS ABOLISHED BY THE BRITISH (orig. Simon Schama).   

Slavery. Free Labour.

France. South Dorsetshire.

Germany. Unser Vaterland.

World War I. The Unfortunate Incident.

World War II. The Even More Unfortunate Incident.

Europe. Northern Africa.

The United States. Daddy.

Margaret Thatcher. Mummy.

Chartism. (See Peasants’ Revolt.)

Suffragettes. Lesbians.

The Working Classes. The Help (orig. Lucy Worsley).

The Welfare State. The Failed Soviet Union (orig. Dominic Sandbrook).

The National Health Service. The Sixty-Year Mistake. The Medical Business Opportunity.

The Tabloid Press. Our Friends in the North.

Bankers. Masters of the Universe.

Boris. Person born and raised in single-parent household.



Wednesday 19 September 2012

A Glossary of Terms for a Tory History Textbook; or, how to teach freshers historiographical semiology

With the new term starting soon, my university campus is undergoing its annual replenishment with fresh-faced first-years (albeit fewer of them this year, for some reason) filled with kittenish excitement, scampering about their wondrous new world with their over-sized eyes and enormous ears. And some of them are historians. But most of those won’t previously have studied history as they will in university, and some of them won’t like the new way of approaching the subject as a subject, as opposed to the subject as a succession of events.  And they will, at least initially, express their alienation with various degrees of mewing and pooing.* (*Just seeing how far I can push this kitten metaphor.  And also I’ve just got a kitten.)

But history isn’t just a succession of events.  It’s many other things too, among which it’s an academic discipline.  That is, it’s a subject made by objects, namely historians.  Contrary to the appearance of some of us, historians are humans.  As humans we carry all kinds of memes, including consciously-held ideological principles (such as a belief in Dawkinsian memetics), and, more trickily because they’re more likely to be held unconsciously, deeper ideational assumptions that undergird those ideological principles (such as a belief in the modern, western, scientific methods of enquiry that helped Richard Dawkins theorise the existence of memes).  Moreover, our ideological principles and ideational assumptions are reflected in the very words we use when crafting our reflections on our subject, again both consciously and unconsciously.  So it is that the articles and books historians write might well be well-meant attempts to reconstruct history as a subject, but they are also the productions of imperfect objects wittingly and unwittingly spunking their own intellectual DNA all over their work.  And to say this is not to spout some post-modern hocus pocus.  Some post-modernists may claim to have discovered subjectivity in historical writing, just as some 1960s hippies claimed they discovered sex, but it was the very pre-post-modern E. H. Carr who rightly reminded us that to study history you also have to study the historian.

Put it all like that to the average fresher, however, and you are likely to be confronted by wide-eyed incredulity, first at the unfamiliar academic jargon, and, second, at the use of the word spunking.  So you have to break the neophytes in gently. In an attempt to do so, we have a freshers’ course here at Swansea called Making History.  Yes, “making” history.  That is, it’s about how we produce and present history (and by “we” I don't just mean academic historians in the traditional sense but also librarians, TV producers, film makers, and anyone else who creates representations of the past).  And one of the things we try to introduce new students to on this course, to employ more technical terms than spunking, is the semiology of historical writing, the semiotics of individual historians, and indeed the subtextual subjectivities inherent in history as historiography.

To that effect, early in the course we make the students read a not-so-subtle article from the Daily Express that purports to reveal left-wing bias and consequent “dumbing-down”* in the teaching of history in today’s politically-correct Britain. (*“Dumbing down” is an almost unfathomably complex historical phenomenon that only its most sophisticated theorists are capable of understanding.  It is in no way a concept of such vacuity that its serious usage is so staggeringly and yet unintentionally ironic that it would make your brain melt if you dwelt on it for too long.)  Reading and discussing this article is really a morbidly fascinating exercise in deconstructing double bluff in a text that claims to expose bias but which itself is biased to the point of the most fantastical inventiveness. Equally interesting is analysing the process by which the distortion happens, attempting to determine whether it lies solely in the mental retardation and moral depravity of Richard Desmond, or whether it just panders to its swivel-eyed, frothy-mouthed, hairy-nostrilled, Diana-obsessed saloon-bar-boor readership, or whether these producers and consumers conspire in a lie-reinforcing loop of crypto-fascist philosophising and neo-Jeremiacal prophesying.

Anyway, I really wanted to have a pop at this lamentable lamentation right here in my blog, which I could then show to students. But then I realised that doing so would give too much away before the actual classroom exercises, effectively spoonfeeding, when another of the points of university learning is that a lot of it should be auto-didactic.  And that should certainly remain the case even in the future of high fees, or else we will be short-changing students on their £9,000 annual investments (and hopefully university senior managers will maintain this ideal against any countervailing pressures, rather than indulging in unprincipled, careerist caving to the powers-that-be).  Also, it’s possible that in the above I might already have given students the tiniest hint of an orientation towards the beginnings of an interpretative lead, which would be very bad indeed.  However, all is not lost.  Listening to certain Collaboration politicians spouting a certain word repeatedly in recent times, and hearing the Conservative Party Political Editor Nick Robinson parrot that word, I began to form an idea about how to teach new students about semiology. 

The word is “efficiencies.” What the word actually means is cuts, specifically cuts in public services (I add that qualification, by the way, in case you thought I meant cuts in bankers’ bonuses perhaps, or cuts in the tax loopholes used by Tory Party donors, which I didn’t, because the word cuts does not apply in any way in these areas).  Anyway, the thing about the term “efficiencies” is that it is meant to signify something while hiding something else, in this instance creating an illusion of cuts in public-service funding that occur without cuts taking place in actual public service provision. These sorts of linguistic gymnastics can also distort the reality of things to the point of suggesting the perfect opposite of their effects, in this instance when cuts actually lead to inefficiencies. Like when “efficiencies” mean that institutions can’t afford to replace ICT equipment, for example, so software systems crash, hardware packs up, and so everything takes ten times longer than it needs to, if it ever gets done at all. I pointed this out, using specific examples, to the Prime Minister’s spokesman/woman/person/thing/gimp Nick Robinson in a series of tweets that I then blogged here: http://stevesarson.blogspot.co.uk/2012/09/a-long-tweet-to-nick-robinson.html. He hasn't replied or in any way thanked me.

This of course is just one of a panoply of words and terms used by the modern master of mendacity David Cameron (post-Bliar, anyway—poor old Gordon Brown was just too tired and/or mad to still be able to lie convincingly by the time he finally became Prime Minister, and therefore lost the last election).  Others include such words as “equality” and “fairness,” slithering easily off the Old Etonian’s tongue even as he knowingly orients economic structures and social institutions even more to the benefit of the rich and to the detriment of the poor than was the case already.  And then there’s what is apparently the Prime Minister's favourite term: Big Society—a term that even Camerrhoids don’t seem fully to understand, but which, as far as I can tell, implies a return to the good old days of Victorian philanthropy, which, not coincidentally, requires the even more conspicuously unmentioned actuality of the bad old days of Victorian poverty.

Which brings me from words back to the idea.  The idea is for a glossary of terms that might similarly illustrate hidden biases in language, although part of the point here is that the hidden meanings are not actually that hard to find if you care to look into it a little bit.  In turn, this might provide a nice, gentle introduction to first-years to the importance of words, and to show them that semiotics is not mere semantics.  So, below, after this customarily long-winded and winding introduction, is a glossary of terms for a Tory history textbook.  The currently used tried-and-tested and widely-accepted terms are on the left.  On the right (see what I did there?—hurrr) are the new Tory terms, agreeable to the kinds of newspapers and the kinds of swivel-eyed, frothy-mouthed, hairy-nostrilled, Diana-obsessed saloon-bar-boors who do so much of Cameron’s work for him. Yes, some of these terms come from the New Labour era, but that of course should be no surprise to anyone and does not diminish for a minute their essential Toryism.

Medieval Society. Early Big Society.

Feudalism. Social responsibility.

Serfdom. Internship.

The Peasants’ Revolt. Class hatred (orig. David Starkey).

Enclosure. Land management efficiencies (necessary and unavoidable, no alternative, and not at all driven by ideology or class-interest).

The English/British/Civil War/s/Wars of the Three Kingdoms (etc.) / The Interregnum / The Glorious Revolution. The Unfortunate Disruptions.

The Renaissance/Enlightenment/Scientific Revolution. The Birth of Capitalism (orig. Niall Ferguson).

The Birth of Capitalism. The Enlightenment / The Great Going Forward (orig. Niall Ferguson). 

Imperialism/colonialism. Global democratisation (orig. Niall Ferguson).

Empire. Free-trade zone (orig. Niall Ferguson).

The Atlantic Slave Trade. African Labour Recruitment System THAT WAS ABOLISHED BY THE BRITISH (orig. Simon Schama).   

Slavery. Free Labour.

France. South Dorsetshire.

Germany. Unser Vaterland.

World War I. The Unfortunate Incident.

World War II. The Even More Unfortunate Incident.

Europe. Northern Africa.

The United States. Daddy.

Margaret Thatcher. Mummy.

Chartism. (See Peasants’ Revolt.)

Suffragettes. Lesbians.

The Working Classes. The Help (orig. Lucy Worsley).

The Welfare State. The Failed Soviet Union (orig. Dominic Sandbrook).

The National Health Service. The Sixty-Year Mistake. The Medical Business Opportunity.

The Tabloid Press. Our Friends in the North.

Bankers. Masters of the Universe.

 
You get the idea. Word.


Wednesday 12 September 2012

Defining Nick Clegg (or at least a kind of Cleggous phenomenology)

So Nick “Your Vote” Clegg has been at it again.  You know, lying, backsliding, you know the drill.  Longish ago, in the general election, his words about the NHS, tuition fees, and pretty much everything else that rolled off his slivery tongue have been contradicted by what he's since done. Back then, he was turned by the Bullingdon-wing of the Tory Party.  More recently, yesterday in fact, it turns out he originally intended in a speech to refer to opponents of equal marriage as bigots.  When found out, though, he said it was a “mistake” and that in fact he wouldn’t “insult” a spade by calling it a spade. To add a layer of irony to the otherwise simply twattish, this time he was turned by the traditional-(anti-gay)-marriage-defending-wing of a Church that was founded by a priapic syphilitic for the purpose of obtaining a divorce in order to produce a male heir to a throne he sat on to such tyrannical effect, and that was then re-founded by a “Virgin Queen” who never got married.  And there it is, Clegg-world in a nutshell, an absolutely batshit nutshell.  (Yes, I meant slivery.)

Of course, the original broken election promises are the worst of all the Cleggfucks we’ve been subjected to since that election.  In that campaign he made commitments that many people (myself included) voted for, and has since done the opposite of what those people gave him a mandate for.  He has thus done violence to democracy itself, as well as terrible damage to the people The Collaboration has committed itself to hurting to immeasurable but indubitably enormous effect.  (For a previous critique of a FibDem and an explanation of that term you can go to: http://stevesarson.blogspot.co.uk/2011/08/danny-alexander-and-pathetic-drivelling.html.)  And while dedicated Liberal Democrats (by which I mean people dedicated to the Liberal Democratic Party, but who clearly don’t give a Tom Tit about liberality or democracy) will continue to vote for whatever their leaders claim to believe in next time around, I don’t think any voters who voted for them on the basis of their purported principles will be doing so again. At least as long as Nick Your Vote Clegg and all his fellow collaborators are in charge (including Vince This is My Power Cable).  But it does of course raise once again the question of who is Nick Clegg?  Or, rather, who is Nick Clegg today?  Because, for sure, being “Nick Clegg” is, at any one time, an entirely temporary condition.  So the question is actually quite pointless, except in as much as it throws up opportunities for pointing and laughing, which, we can safely presume, will remain a legal activity for as long as Nick Clegg doesn’t promise that it will.

So, who is Nick Clegg?  Let’s try to define Clegg, but just, as I say, for a bit of a laugh.  Certainly, the kindest things you can say about him are that he’s open-minded, receptive to new ideas, flexible, and likes working with others.  These might well be suitable and indeed desirable qualities in the right circumstances, such as being a fictional or at least fictionalised character in a satirical sitcom that skewers the amorality and consequent hypocrisies of certain types of power-hungry politicians.  Although preferably not one written by Armando Iannucci, a satirist who now sells his wares to Sky, which perhaps finally unlocks the mystery of how it is that the characters he creates are such perfect incarnations of amorality and hypocrisy.  Sadly, however, Clegg’s qualities are entirely unsuited and indeed are or ought to be absolutely antithetical to being an actual party-leader in a real-world democracy.  Indeed, in these respects he’s about as useful as Iannuccism is as a political philosophy, or even as a serious as well as piss-taking (i.e. properly satirical) critique of politics.  For more and much greater expertise on Iannuccian politics, in fact anti-politics, see: http://nottspolitics.org/2012/06/18/the-anti-politics-of-the-thick-of-it/

But, just because defining “Clegg” as Clegg is pointless, on account of what we might also generously refer to as his mercuriality, doesn’t mean we can’t use the word “Clegg” as a prefix (or for that matter a suffix, but I’m going here with prefix) for all kinds of things that are Cleggacious; that is, things that reflect the characteristics of Nick Clegg.  Politicians’ names have long been used for their illustrative qualities beyond their immediate context but still in related contexts, sometimes in metaphorical and other allusory fashions.  Thus “Machiavellian” needn’t just narrowly refer to a Florentine renaissance philosopher’s expertise in high-order, low-down political chicanery, but can also refer more broadly to unscrupulous shenanigans in modern politics, office politics, business more widely, and in the administration of football associations. And just as “Stalinist” refers most directly to a ruthless dictator who murderously attempted to control all aspects of life in the old Soviet Union, it might also refer to a particularly odious ex- who tried to supervise every moment of your time, monitored your communications, and burned photographs of your previous partners.  It is in these latter spirits that I think we can define “Clegg” the word, if not Clegg the man, and try to identify an essence of Clegg, a Cleggness, perhaps even a kind of meta-Clegg, or a sort of phenomenology of the Cleggous.

Cleggot: one who claims to oppose bigotry and bullying until confronted by bigots and bullies, whereupon he or she (he) teams up with bigots and bullies.

Clegg (verb form): 1. To obtain votes by deception.  2. An inconvenient bodily motion associated with a particular kind of frightening incident, as in “The new boy promised to protect Tom and Scud from the beastly bullies, but then the fiendish Flashman appeared and, alas, the poor chap clegged his pants.”  3. The act of abandoning your erstwhile friends in an unprincipled and cowardly fashion.  As in, “The new boy promised to help Tom and Scud protect the tuck shop in which all the lads had invested together, but when Flashman and his friends came along he clegged-it as quickly as he could (although bearing in mind that he’d also clegged himself, his gait had a rather awkward prospect!), leaving his former friends overwhelmed and the greedy fiends free to steal everyone else’s goodies.”

Cleggottery: unlike a lottery, which has constant rules and in which everyone has an equal chance of winning and losing, a cleggottery has rules that change once the game is over, to the effect that all the winnings are then given to bankers and their political enablers, all ex-private schoolboys with vast amounts of inherited wealth originally generated by off-shore no-tax schemes.

Cleggervane: A sort of a-moral compass that at one moment follows the direction of the wind, but which, when manipulated by evil forces, turns at a Clegg-Angle.

Clegg-Angle: an angle of exactly 180-degrees, and, in fact, therefore, not actually an “angle” at all.

Cleggebrae: an internal apparently skeletal structure that can give an organism an appearance of considerable substance, but which is illusory, and so the organism will in fact quickly decay into insubstantiality. That is, what bananas are made of that makes them look all firm and yellow and lovely to begin with but then turns them into a shitty-looking mush of transcendental hideousness.
     
Cleggeology: a specialist field within the discipline of geology that explores illusory and unstable terrestrial phenomena such as mirages, shifting ground, and quicksand.

Cleoggology: a kind of secular theology that involves the study of ineffable phenomena, self-contradicting pseudo-philosophies, and charming prophecies that will never be fulfilled.

Clegguistics: the ability to speak on all levels of untruth in many different languages.

Post-clegguistics: the inability to speak any kind of truth in any language.

Cleggygraph: a kind of polygraph that can see into the future and is thus equipped to unmask those who can lie so convincingly that only events that have not yet happened can uncover their mendacity.  Unfortunately, the cleggygraph is still only at a conceptual level of development, and so for now voters (and, if only vote-stealing was actually a crime, the police) must continue to make judgments based on Cleggsperience.

Cleggsperience: the fact of having been lied to on such a scale that you could never possibly believe the particular chubby-faced interlocutor again. It can lead to feelings of incleggulousness, to which only the cleggulous are immune.   

Incleggulous: a feeling of shock at a betrayal of such transparency and moral enormity that it is beyond the relatively innocuous feeling of mere incredulousness.  As in, “Oh my God! I thought I was voting for a socially progressive political platform as expressed in the official Xxxxxxx-Xxxxxxxx Party Manifesto, but in government they are betraying their democratic promises and are actually dismantling the very fabric of the civil society I believe in and upon which all things decent and just must depend. I am absolutely incleggulous!”
    
Cleggulous: the ability to be fooled a second time by someone who has already proven themselves to be a most fantastical liar.  This ability represents such an extreme form of delusion that it is in fact a form of mental disability, although experts are divided on whether or not it actually constitutes an illness.  Nevertheless, the degree of intellectual debility required to be medically certifiable as cleggulous was illustrated by George W. Bush when he said, as he did in Nashville, Tennessee, on 17 September 2002, “Fool me once, shame on, shame on you. Fool me -- you can’t get fooled again.”

Cleggology: an apology that is so breath-takingly ill-judged and of such staggering inadequacy that absolutely no one takes it in the least bit seriously and all everyone can do is laugh, mock, and make spoof YouTube videos out of it.
           

 

Tuesday 11 September 2012

A long tweet to Nick Robinson

Dear @bbcnickrobinson, I see you’ve taken to using the word “efficiencies” for government cuts, 1/10

@bbcnickrobinson so I would like to tell you why “efficiencies” is an inaccurate term. 2/10

@bbcnickrobinson My old computer, which my university can’t afford to replace right now, 3/10

@bbcnickrobinson has just crashed 6 times in an hour, costing me a great deal of time. 4/10

@bbcnickrobinson This is just one small example of 1000s every day in universities, 5/10

@bbcnickrobinson and millions across the public sector, of how cuts cause inefficiencies. 6/10

@bbcnickrobinson Of course no proper journalist would knowingly use inaccurate terms, 7/10

@bbcnickrobinson & so now I’ve explained to you how inaccurate this term is, 8/10

@bbcnickrobinson you no longer have any good reason for using it. 9/10

@bbcnickrobinson You’re welcome. 10/10     

Sunday 29 July 2012

The Olympic Opening Ceremony as Popular History and Public Enterprise; or, in your face, David Cameron

18 July 2020. Apparently, last night there was a repeat of the London 2012 Olympic opening ceremony, so I thought I'd revive this. It's sad, though, looking back on it. As the post makes clear, Danny Boyle's ceremony seems to celebrate the best of Britain--the parts that are proudly progressive, multicultural, and outward looking. It seems like a very different place today.   

2 January 2013. I wrote this post last summer and just add this note now in appreciation of Danny Boyle's decision to turn down a knighthood in the New Year's Honours because, as he put it, he is "proud to be an equal citizen." That of course is entirely consistent with the spirit of the Olympic ceremony that he directed. Something obviously lost on whoever decided to offer him this spurious form of recognition. Public life today is densely packed with venal people of no principle, although it has been in the past as well. But, whatever, how great it is to see a person of such talent show such adherence to principle. Yay to Danny Boyle. The man deserves a knighthood....

Back to July:      


Like many others, when I first heard what it is now clear were injudicious leaks that Danny Boyle’s Olympic Opening Ceremony was going to feature green fields, farms, sheep, chickens, and cricket, I indulged in the unpleasant cynicism that is the birthright of all freeborn Britons. While others made clever jokes on Twitter about large piles of burning mad cows, I made a lame effort about an entire nation pooing itself with embarrassment, my equally lame excuse for which is that I was indulging in the unpleasant scatology that is the birthright of all freeborn Britons. Watching the ceremony the other night, however, I found myself caught up in the magic of the show, entranced by the technical prowess, awed by the artistic awesomeness, seduced by the sentiments, and filled with patriotic pride. It was like I’d turned into an American, or something.

And even in the cold light of post-ceremonial reflection, I still feel the same. Sure, in a show that first featured a representation of British history, and then of modern British culture, that lasted a little less than 90 minutes and had to appeal not just to a partial but to a national and not just to a national but to a global audience, there was bound to be something for everyone but also, by the same token, something for everyone to complain about. First off, I was scrunch-faced with deep concern about the history and present state of the nation being represented largely though the medium of contempoweh darnce. Soon enough, however, the cynic in me was overwhelmed with admiration for the technical miracles achieved by the set people, as well as by the obvious artistic brilliance of the performers. Even the latter wasn’t too badly undermined by Kenneth Branagh’s adoption of the hamminess that is apparently obligatory for all Britain’s “great” actooors. On the other hand, to give Branagh the benefit of the doubt, Victorian cameras took a long time to take in enough light to make a picture, so photographic subjects had to stand still and maintain the same facial expression for a long time, and expressionless stiffness verging on sternness is the easiest apparent attitude to maintain for what must have seemed like forever for people for whom having their picture taken undoubtedly felt even more excruciatingly unnatural than it does for most of us today. So, maybe, contrary to pictorial evidence of an overly formal-looking stiff-neck, Isambard Kingdom Brunel in real life really was a nostril-flaring gurn-merchant with a theatrically shit-eating grin. 

Anyway, then I thought, well, what do I want here? Or, rather more to the point, what does the world want here? An actual history lecture? Of course not. Especially as media types today seem to believe there are only two historians suitable for big occasions. That would have meant either Starkey the Dinosaur boring on about Henry VIII, again, or, worse still, yet more nostril-flaring hamminess but this time provided by the post-tumescent totem of post-imperial diminishment that is Niall Ferguson. So, really, I was pretty glad that it was an entertainer and not a historian who was paid to portray Britain on this occasion. And I’m glad too that that entertainer was Danny Boyle. We could after all have ended up with a qwhite tedious festival of twee by Richard Curtis, or, worse, of the ghastly petit-bourgeois snobbery and spite that for some inexplicable reason makes Mike Leigh so popular. But no, the working-class Lancastrian of Irish-Catholic origin who went to a Welsh University (Bangor) was the right choice to give us what we needed. 

But did he give us what we needed? I think he did. Sure, I’ve seen some people point out what he failed to give us, such as the police beating up strikers and suffragists, and the historian in me knows those are true points if not necessarily fair ones. I myself was ultimately dashed in my hopes of seeing a gigantic Godzilla-figure in a blonde wig and a blue dress ferociously smashing down the industrial-era chimneys with a gigantic handbag. But Boyle’s mission no doubt was to come up with unifying themes we could all celebrate, given the occasion. And, given that consideration, he could have come up with something arse-achingly anodyne. Yet, in fact, Boyle came up with something that struck me as highly thought-provoking and perhaps surprisingly subversive. 

First, those fields, farms, and cricket pitches were peopled by peasants rather than knights in shining armour doing their derring doo doo. Then, out of that weird hill thing at the end of the stadium, emerged industrial workers, hundreds of them, and Jarrow marchers, and suffragists, and migrants off the Windrush. Excellent—a people’s history! And on the latter point, the Windrush people for once weren’t represented as Britain’s first black people. There were people of all races in the pre-industrial part of the show too, quite rightly in a representation of a country that had a population of up to and perhaps over 20,000 non-white people in the eighteenth century. Yes, there were capitalists, the best of them represented by Brunel, if perhaps less so by Gurnagh, but the others looked a bit useless and shifty to me, waving their arms about and vaguely giving directions while the working people did the actual work of building Britain and making it what it is, right up to the hardcore steelmaking of the Olympic rings themselves. This was not unalloyed celebration, though. This section of the show was called “Pandemonium,” recalling Milton’s hell in Paradise Lost, raising the spectre of the suffering of early industrial and even some of today’s working people, and there was a Great War memorial moment, reminding us of the horror of industrialised armed conflict. But it’s also as if Boyle was saying that Thatcher might have unmade these industries, but these industries were made by these people, and so it was these people who made Britain what it is even to this very day. That Britain is about its makers, its workers, not its un-makers, its Thatchers. Prime Minister and vicious, right-wing Death-Moomin David Cameron is probably too dim to get it, but, nevertheless in your face, David Cameron.

Another in-your-face-David-Cameron moment came with the arrival of the dancing nurses and patients of Great Ormond Street Hospital. Apparently, Cameron gave out 17 tickets to the opening ceremony to “Big Society” volunteers. Seventeen. Danny Boyle gave us 600 dancing NHS staff and patients: trained, professional public servants and members of the public they serve. What a wonderful celebration of an institution that manifestly benefits all and that therefore our currently governing Eton and wannabe-Eton millionaire elitists so transparently despise and want to destroy. There are pictures of the dancers and of that hands-off-our-NHS slogan all over the internet now, daring Cameron and his bully boys to have a go. It was also interesting to see the nurses, doctors, and patients dressed in old-style uniforms and bedwear, and to see the beds with distinctly old-fashioned-looking metal frames. As if to say that this institution is part of our history, and, Cameron, if you make a speech tomorrow pretending to celebrate this celebration of British history, you’re celebrating the NHS. Of course, Cameron and his born-to-rule bully brigade will remain as arrogantly able as they always have been to pretend to respect the NHS while taking what action they can to destroy it, but Boyle has given them a warning and has given those of us who oppose them a new and gorgeous and globally celebrated totem of resistance. We will put it in your face, David Cameron. Reminder: Aneurin Bevan, architect of the NHS, described Tories as “lower than vermin.” He was correct.

The NHS, particularly in period costume, and still surviving at the point of writing, of course links the past and the present, the history and the present state of the nation. On the latter, the Boyle show was, for me, just as great and satisfying. Beforehand, people asked how the British ceremony could possibly top China’s, raising the rather dubious question of whether Olympic ceremonies are actually supposed to be as competitive as the sporting events themselves. Well, if one must compete, one can change the rules. Danny Boyle didn’t go for the faster, higher, stronger that a much larger country with a totalitarian regime’s control of people and resources can muster; whereas the People’s Republic dubbed a child singer’s voice and replaced her in the stadium with a “prettier” substitute, Boyle had a choir of children with hearing impairments and other disabilities singing God Save the Queen. Even the subject of the song herself managed to raise her facial register a notch or two above the “what the heck was thet?” expression that we know and love her for. Not that these ceremonies should be competitions for national supremacy, as opposed to celebrations of the great things all nations and peoples have. Boyle made the point himself when he said “The ceremony is very proud, but I hope in a modest way,” a point about a Best of Britain attitude that was somehow missed by those politicians and media people who bored on the following day about how Boyle showed that Britain is Best. We also had Emeli Sande doing a bonkers but beautiful version of Abide with Me, with her nose ring. And we had Millie Small, the Beatles, the Kinks, the Stones, The Who, Pink Floyd, Queen, the Sex Pistols, the Clash, the Jam, The Specials, Frankie Goes to Hollywood, the Arctic Monkeys, and Dizzee Rascal, as well as, quite rightly of course, a bit of Shakespeare, Blake, and Elgar. 
And he gave us a bit of a laugh at ourselves too (never a strong feature of totalitarians or Tories), with James Bond (and not just Daniel Craig but also National Treasure David Beckham speed-boating up the Thames, Bond-fashion), with “the Queen” parachuting into the stadium, and with Mr. Bean among the London Symphony Orchestra and then sending up Chariots of Fire (and, with his beach-running shenanigans, perhaps sending up the very silly British or at least English sense that we have a unique “sense of fair play”—and what better place to laugh at our darker flipside, the implication that Johnny Foreigner is a bladdy cheater, than the kick-off of the Olympic Games?). Happily, and importantly, and tellingly, Danny Boyle did not wheel out Cliff Richard. Nor did we have Elton John whoring his once-great homage to Marilyn Monroe into a brown-nosing travesty of a tribute to spoilt royalty, or any other such buttock-clench-inducing betrayal of the point of post-rock-and-roll popular music. We also had celebrations of British film and TV and of the people who watch them, and of modern communications, all ultimately personalised in the form of the internet-originating romance of Frankie and June, interestingly old-fashioned names for two very modern-looking mixed-race young people—all symbolising British people who are happily connected and united across generations and races, with no false talk of “dividing lines of...”, and scenes with sentiments far, far, far from the fear-mongering, divisive, hateful, and actually anti-patriotic cant of the Daily Mail and of David Cameron’s “Broken Britain.” 

And indeed no cynicism about the internet, portrayed here as a phenomenon that can and does bring us together far more than it divides us. Danny Boyle even had Tim Berners-Lee in the show, and described him as “the scientist who invented the World Wide Web, and even more important than that he put it in trust, made no personal gain,” leaving it “free to us all.” How different again from the Cameronian cynicism that says that the best can only be produced via motives of profit (unless it’s care of the elderly, disabled, and otherwise disadvantaged, in which case the work can be done by unpaid “Big Society” volunteers with random levels of training, competence, and commitment). And that brings us back to the politics of the ceremony. Asked by the Daily Telegraph whether his ceremony was “overtly political,” Danny Boyle denied that it was overt. “The sensibility of the show is very personal,” he said, “... we had no agenda other than values that we think are true.” It seemed to me pretty clear from its contents what the show’s true values were. It was also pretty clear what the message of the method of the show’s creation is, or certainly ought to be. That is, isn’t it great what Danny Boyle and his dedicated team of not-for-profit creatives, organisers, and artists could do with the investment of £27 million of public money (a sum, let us never forget, that would barely fill one board of bankers’ bonus bags with taxpayers’ bailout cash)? They certainly did a better job than the go-getter entrepreneurs of G4SClub7, the for-profit “security” firm that couldn’t spot a terrorist threat if a large man with an eye patch and hook-hands walked past them carrying a black spherical object with the word “BOMB” written on it. Thankfully, we have the (still) state-funded, non-profit-making dedicated professionals of the police force and the army to bail us out yet again from the cut-price incompetence of money-minded racketeers.
In your face, David Cameron, in your shiny, hammy, wobbly stupid face.