Wednesday 14 November 2012

The Marketisation of Higher Education: A lesson from the seminar room

Update: The Times Higher Education asked me to do a shorter version of this post for their comments section.  It is here: http://www.timeshighereducation.co.uk/comment/opinion/employability-agenda-isnt-working/2002639.article#.UUrF79iMf6g.twitter

The original post....

The other day, Howard Hotson, an Oxford history professor and steering committee chair of the Council for the Defence of British Universities, wrote in the Guardian about the dangers of marketisation in Higher Education. http://www.guardian.co.uk/commentisfree/2012/nov/11/universities-great-risk-we-must-defend-them?fb=native&CMP=FBCNETTXT9038

Professor Hotson's particular point here was about business management models being applied in academe, but as he and others, including the Council for the Defence of British Universities, have said, marketisation represents a more general threat to "fundamental academic principles and the purpose of higher education itself." Indeed I had an experience this very morning of how that threat works where it matters even more than in management—in the classroom. My department does a first-year course called Making History that aims first and foremost to teach students about the complexities of researching and writing history. We also use it to teach transferable skills such as how to write and structure essays, how to present written work, how to reference properly, and so on, at this handily early stage in their student careers. And as of this year we are also using it as a means to integrate "employability" into our curriculum. My colleagues and I co-teach much of this course in seminars, and today's seminar was devoted to CVs.

On the face of it, there may seem nothing wrong with any of this. Teaching the problems of historical analysis is so obviously a part of our remit as history teachers as to require no further comment. Teaching how to write, how to communicate historical and indeed any kind of knowledge, is similarly unquestionably good. So is turning out people with degrees who are equipped for the job market. Indeed, through my 20 years as an academic I've always taken my responsibilities in this regard very seriously and worked hard to prepare students to be able to make their livings, as well as teaching them the critical thinking skills that are an essential part of undergraduate education. There is not, needn't be, and shouldn't be a zero-sum game or a polarised debate about whether the role of academic teachers like me is to train students in critical thinking or (it should be "and") to train them in the knowledge and skills they need to find a place in, and be useful in, the world of work after they graduate. We can and should do both. So what am I worried about?

I'm worried because even though there need be no zero-sum game between the aforementioned ends of a university education, and indeed one might argue that the best employees as well as the best citizens are ones with free and critical minds, problems nevertheless arise when the proponents of one agenda attempt to diminish the other. And this morning's class was, for me, an example in microcosm of how the "employability" agenda is diminishing the imperative of teaching critical thinking. Here's how.

"Employability" is now indeed explicitly an agenda, and it is an agenda driven in very particular ways by very specific interest groups. Peter Mandelson, the high priest of New Labour neo-liberalism, moved responsibility for universities from the education department to the Department for Business, Innovation and Skills, where it remains under the charge of Vince Cable, a man who once preached against economic neo-liberalism but whose words turned out to be about as reliable as a Liberal Democrat manifesto pledge. Our masters at the Department for Business, Innovation and Skills have since made it no secret that they want HE to focus more on employability, on preparing students for the world of work. This kind of thing used to be the task of the Careers Office in combination with academics in their capacities as Personal Tutors. There it remains, though that aspect of Personal Tutoring is now called Personal Development; a clear linguistic signal about who is setting the agenda and what kind of agenda they're setting. But the newest development in this clearly creeping process is that in parts of the students' timetables where we used to teach them history (and you can insert any other discipline here to the very same effect), we are now expected to teach things like CVs. Employability at my university has therefore now crossed the boundary from extra-curricular to curricular activity. It may have done so already at other institutions; it will no doubt do so later at yet more. To ensure that we do this, by the way, "employability," measured in terms of what proportion of a university's students are in full-time employment six months after graduation, features prominently in university league tables—another stick in the bag used to beat universities into submission to the ethics and practices of the market. University managers feel they have no choice but to do the bidding of business and government or else go "out of business."

Is this really that bad? you may be asking. It's just one seminar out of ten; surely not too much time to take from the academic syllabus for the sake of employability. My first answer is that, okay, it's one of ten seminars—for now. Far more revealing, however, is what I was required to do in this one class. As noted above, the employability agenda in general is being driven by business interests operating through the government, which pressures university managers. The origins and intentions of the agenda, though, very much show through in the way "employability" is required to be presented in the classroom. For each seminar, each tutor on the Making History (now Making History History?) course is presented with a "task" for him or her and the students to do that week. Here, in full, is the one I was given for this morning's seminar (my co-teaching colleagues got the same one).

“Seminar 7

For this task you need to prepare three things:

  • a CV
  • a paragraph identifying its weaknesses
  • an action plan for how you are going to address these weaknesses
Guidance on all this can be found in the employability section of the Information for History and Classics Students site on Blackboard."

As this indicates, the task is not really just about the perfectly laudable aim of helping students prepare to find jobs. It's about much, much more than that. It's about turning them into cogs in the corporate machine. Let's go through each of the things the students had to prepare, one by one, to see more fully what they're about.

First, the students were required to present their CVs in front of each other, inviting them to think of each other as competitors. Of course one day they will be competitors for individual jobs in the marketplace. But it is not necessary to make students think of themselves as employment rivals in the first semester of their first year in university, except perhaps if the aim is to inure them to the notion that they are competitors and to normalise that perception of themselves and of the world, to make the rat race seem the right and only way for the human race.

Next, as above, the students were asked to identify their weaknesses as potential employees. Again, they're first years in a history class. Why do that this early? Here's why. Another piece of modern corporate cant is that workers need to be flexible and adaptable to the needs of business, as opposed to businesses being flexible towards the needs of individuals, communities, countries (paying their taxes?), the environment, and so on. Once again, then, it's about inuring students early to the first imperative: making them think that they have weaknesses, that they are the problem, that they need to adapt, the underlying assumption of all of which is that they serve the world of business more than the world of business serves them and the rest of us.

The third thing they had to prepare would put that notion into practice: that they need to serve business and business does not need to adapt or serve them. Not just ideological co-option, then, but the beginnings of behavioural co-option; not just indoctrination, but actual preparation for collaboration. It had the added intent, perhaps, of accustoming them to the language of the modern boardroom, of "action plans" in this instance. How's that for "thinking outside the box"?

The effect of the whole, especially when we as a History and Classics Department give it our sanction by including it in our syllabi and on our information sites (which, as above, we emphatically do), is to promote the notion that all this is alright, a natural order of things, or at least that it is inevitable and something students must accept. Not something they should question with critical, thinking minds. Certainly not something they should complain or protest about.
 
As it was, I chose, as far as I was able, not to let the above happen to its full potential effect. First, I felt it was unconscionable to require students to present their CVs in front of other students. True, CVs are essentially public documents that are eventually handed over to individuals or committees hiring workers. But I felt it was wrong to make them hand these documents over for perusal in a classroom. And not just because it would encourage students to think of each other as economic competitors at a time in their lives when they should be thinking of each other as members of an academic learning community. But also because some might feel uncomfortable or even humiliated by the process of comparing their records with those of others in this way, and wholly unnecessarily so at this early stage of what we still sometimes call their academic careers—especially when they're asked to identify and establish "action plans" to deal with their "weaknesses." They should not feel or be made to feel either uncomfortable or humiliated, and they wouldn't be if we gave them the message that right now they're students and shouldn't feel obligated to be job-market-ready just yet, and that it's okay at this stage of their lives if they have other interests and priorities. But we're not giving them that message. By allowing the "employability" agenda directly into the classroom in the form of presenting CVs, making them dwell on and fix their "weaknesses," their unfitness for purpose, we are telling them instead that their worthiness as students depends on their job-readiness and on their acceptance of what they are told that requires. So, rather than force that on them, I gave my students today a choice of showing their CVs, identifying their weaknesses, and making their action plans, or not doing so. I felt I had to let those who wanted to do it do so, in order to maintain equal opportunity with the students in other classes who were doing it. But I gave them the choice not to do it.

I also decided I wouldn’t let the classroom time go by without asking them to discuss the above issues, allowing them the opportunity to do some critical thinking, to reflect on their learning and take a measure of ownership of it—concepts that are not as in vogue as they were before business interests began determining educational priorities more directly.  In that spirit I assured them, as I always do, that they must make their own minds up and argue for themselves—not follow my lead.  I should no more try to indoctrinate them than business interests and governments should. The subsequent discussion revealed a wide range of opinion (though no reductive polarisation)—a very healthy thing among intelligent, independent-minded young people, who, I was thereby given encouragement to believe, will not be easily brainwashed by anyone.

A final point. Another aspect of market-based thinking that's always thrown at us in support of the marketisation of Higher Education is the cant of consumerism. And the supposition is that, as "consumers of the higher education experience" (Peter Mandelson), students' top priority is job training. This is a false proposition, first because it presupposes that students are first and foremost consumers, although that's a compelling proposition when you've laid the groundwork for it by charging students up to £9,000 a year for a university education. It's also a weird one, though, when we're so often presented with the notion that there's no alternative to the marketisation of Higher Education and indeed of just about everything else—that is, there is no choice but "choice". And it's demonstrably false for another reason. It was interesting that this morning five out of ten students opted not to engage in the CV task, and ten out of ten thought that employability education and advice should be left to trained experts in dedicated careers offices and should have no place in a history course. Another five who are registered in the group chose not to turn up for the CV task at all. I'm required to report them for their absences. But I'm choosing not to.



Monday 5 November 2012

Remember, remember the fifth of November; or, understanding some aspects of the historical origins of the US Presidency; or, why Americans are roundheads

Newly reposted and slightly revised for 5 November 2014, as many US voters once again proved how very different they are.

This blog post was originally intended as a serious-ish discussion of aspects of British and American constitutionalism that help explain some aspects of the current US Presidential election, aspects deriving from the Glorious Revolution of 1688. The tenuous-ish link to November 5 has to do with William of Orange landing at Torbay on that day, 324 years ago. As is my way, though, I got diverted for several hundred words into a catastrophically meandering preamble replete with the usual desperate, attention-seeking lengthy humorous asides, or non-humorous asides, depending on such things as your point of view, and age. So, if you want to skip all that and get to the serious bit, scroll down until you see the words, in bold, This is the serious bit.
As children, my fellow undergrown gumps and I were often told by our teachers and our televisions to "remember, remember the fifth of November." Sometimes these authorities invoked this phrase as part of health and safety lectures about not holding lit fireworks and thereby ending up with a blackened lump of burning bone and smouldering skin where your hand used to be, although even back then I thought that wouldn't be such a bad thing in some cases, for the greater good, you know, the Darwinian betterment of the human race and all that. This may be harsh, but it is an important consideration. Imagine if lizards made an evolutionary comeback. The two-legged stand-up ones (who were always the dangerous ones, compared to those fat, four-legged, leaf-eating dumbosaurs) would have major evolutionary advantages over us, accustomed as they would already be to performing daily tasks with naturally short arms and stumpy paws. That's where namby-pamby health and safety will get us once global warming returns us to the most elemental struggle for survival—defeated and eaten by dinosaurs.
Anyway, I digress.  We were also taught that the fifth of November is about Guy Fawkes’s attempt to blow up the Houses of Parliament in 1605. “Remember, remember the fifth of November” comes from the following, or some variant of it:      
Remember, remember!
The fifth of November,
The Gunpowder treason and plot;
I know of no reason
Why the Gunpowder treason
Should ever be forgot!
Guy Fawkes and his companions
Did the scheme contrive,
To blow the King and Parliament
All up alive.
Threescore barrels, laid below,
To prove old England's overthrow.
But, by God's providence, him they catch....
Fair enough, an important historical moment to be sure, and it should be remembered.  But it shouldn’t and doesn’t have to be and often isn’t remembered to the exclusion of other fifths of November.  We’ve already seen that our future Giant Lizard Kings will remember the fifth of November as a crucial factor in their eventual ascendancy over us.  But there are other fifths of November to remember too. One that used to be universally remembered in the Anglo-dominated world is the fifth of November 1688, the day that William of Orange landed with his forces at Torbay, in Devon, on his way to overthrow James II and claim the throne for himself as King William III and his wife as Queen Mary II.
Now this event, the Glorious Revolution as it came to be called, can, of course, be remembered in different ways for different things. The English might remember it for its restoration of Parliamentary and other liberties from the prerogative-grabbing claws of William’s father-in-law James, the father of Mary, a legacy perhaps best encapsulated in the Declaration and then the Bill of Rights of 1689.  It often isn’t remembered this way, however, or indeed at all, as I pointed out somewhat shirtily in a previous post about a radio programme positing that Britain should perhaps adopt a Bill of Rights, perhaps based on the US one (we already have one, from 1689, and the US one is partly based on it):  http://stevesarson.blogspot.co.uk/2011/10/britain-already-has-bill-of-rights-yes.html
Alternatively, and equally validly, Catholics might remember the fifth of November 1688 and what followed it as the beginning of a century and a half of political and economic disfranchisement and social segregation. The Scottish might and perhaps more often do remember it as a moment in the aggrandisement of Parliamentary authority that led to the kind of imperialism that saw their parliament abolished in 1707, not to be restored until almost three centuries later. The Irish might and certainly far more often do remember it for James's final defeat by William at the 1690 Battle of the Boyne, and what followed—another moment in the on-going oppression of the Irish and of Catholics. Patriotic English people used to remember it this way too, only in a rather more celebratory way, and indeed the fifth of November often used to be called Pope’s Day, and people would burn effigies of the pontiff rather than of Guy Fawkes. The Protestant Irish still see it in something like these terms, and to this day they march about in traditional seventeenth-century costumes, such as bowler hats of the exact same kind that King William and Queen Mary used to wear.
Whatever the case, for all the Glorious Revolution's ambiguities as an event, I think it’s a shame that it isn’t as well remembered as a moment in the history of liberty as well as other things as it might be. It certainly used to be. When the American lawyer and legislator John Dickinson wrote his famous (and deviously misnamed) Letters from a Farmer in Pennsylvania, objecting to British taxes in the form of the 1767 Townshend Duties, and to Parliamentary presumption of an entitlement to rule over the American colonies, as expressed in those taxes and in the Declaratory Act of the previous year that proclaimed Parliament’s right to legislate for the colonies “in all cases whatsoever,” he published the first of the letters on the fifth of November. And everyone knew what that meant—Dickinson was performing a symbolic insurgency in favour of liberty analogous to the invasion of England by William of Orange, the cheeky monkey. 
This is the serious bit.
[Fellow historians and, for that matter, political scientists: please see * below for a quite possibly unnecessary and vainglorious note on intellectual property.]
So, anyway, why, given all the different ways to remember the fifth of November, am I writing about it here today? It's because, as it happens, this year the fifth of November falls on the day before the US Presidential election. What possible bearing can the fifth of November 1688 have on the 2012 contest for the occupancy of the most powerful address on earth, 1600 Pennsylvania Avenue, the White House? Well, quite a lot, actually, and here's why.
As many historians of colonial America have said, in some ways the Glorious Revolution in the colonies was similar to the one in England. Yet in some crucial respects, its settlement worked out very differently on either side of the Atlantic Ocean. In the run-up to 1688, colonists felt that their rights as "freeborn Englishmen" were being violated by James II or his minions much as Englishmen back home felt theirs were. The need for actual revolutions in most colonies, though, was obviated when colonial governors declared allegiance to Parliament and to William and Mary once news reached America of James's overthrow. Two of the three exceptions prove the rule. In the colonies of Massachusetts and New York, local revolutions overthrew Governor Edmund Andros in Boston and Lieutenant-Governor Francis Nicholson in New York City, who were agents of James II's direct rule of the Dominion of New England. When that regime collapsed in those places, it also collapsed in the other Dominion colonies of Connecticut, New Hampshire, Plymouth, Rhode Island, and New Jersey. Things were slightly more complicated in Maryland, a proprietary colony owned by the Lords Baltimore, the Calverts. They were Catholics, which brought a religious dimension to their overthrow in 1689 by a Protestant Association and made that local revolution even more resonant of events back in England. (The Calverts got their proprietary back in 1715, after Benedict Leonard Calvert, the fourth Baron Baltimore, converted to Anglicanism.)
As the above indicates, there were local contingencies that made the Glorious Revolution, or revolutions, in America sometimes somewhat different from what transpired in England, but it’s how the Glorious Revolution’s settlement panned out over the long term that really explains some of the differences in British and American politics since, including the nature of the executive arms of government and the relationships of executives to legislatures on this side of the Atlantic and that.  
In England, desperate as politicians were to be rid of the Papist tyrant James, as they saw him, and to restore what they saw as England's ancient constitution, they were also desperate to avoid falling back into the dark days of the 1640s, back into civil war in which one in ten of the population died, and into God-only-knew what kind of cosmic retribution another regicide might invite. Men as diverse as radical Whig believers in popular sovereignty, moderate Whig and Tory believers in Parliamentary sovereignty, and High Tory believers in the Divine Right of Kings therefore worked together on a compromise that all could peaceably live with. They thus agreed that James had "abdicated" rather than been forcibly overthrown. That allowed them to believe that the Declaration and Bill of Rights represented the work of the people, or of Parliament, or of God, as ideological preference directed. Eventually, they could all believe in the "principle of co-ordination" whereby sovereignty, whether it originated in the people, Parliament, or the crown, in practical terms worked through the operations of the crown-in-parliament. By the 1720s, the time of the premiership of Robert Walpole, the British parliamentary system worked much as it does today. The leader of the majority party in the House of Commons becomes the King's or Queen's First or Prime Minister, the head of an executive that in the nature of the system controls the majority in the legislature. (Or, in the event of a hung Parliament, the leader of the party with the largest minority can form a coalition with the leader and members of a third party to form a working majority in the House of Commons, especially if members of the third party are a sufficiently unprincipled bunch of shit-weasels who, for the sake of experiencing chimerical power, are willing to betray every principle they promised during the foregoing election to uphold.)
In America, however, or, rather, in the various Americas, it didn’t turn out this way at all. Royal governors were just that: royal governors who derived their authority from the imperial centre, whether formally from the Privy Council and thus the crown, or in practice from government ministries, or indeed from the complicated but co-ordinated operations of the crown-in-parliament. Governors could not co-ordinate with provincial legislatures in the way the crown could and did with Parliament in London, as that would have put them at odds with their metropolitan masters. Even when they did what local legislative assemblies told them to do, as indeed many had little choice but to do, they could not institutionalise that kind of executive-legislative co-ordination, and ended up politically ineffective from an imperial-interest point of view. When they were active, the implementation of imperial-executive will necessitated such actions as proroguing and even dissolving colonial legislative assemblies, vetoing their legislation, and exercising other traditional executive actions such as creating courts and dismissing politically disagreeable judges—the kinds of prerogative powers that were either explicitly outlawed by the Bill of Rights or else obviated by the operations of co-ordination within England (or, after the 1707 Act of Union, Britain). It was this kind of executive prerogative that Parliament adopted towards the colonies from the passing of the Sugar Act in 1764, ultimately leading to American independence in 1776, which the British were forced to recognise in the Treaty of Paris of 1783. 
The Americans thus never developed the tradition of co-ordination that defines the British system of parliamentary democracy. They thus to this day retain a degree of separated rather than mixed powers in their constitution. Thus it is that the American President is elected every four years singly and separately from members of Congress, who are elected every two years in the case of Congressmen and Congresswomen (members of the House of Representatives) and every six years in the case of Senators. Thus it is that there can be and often is a President of one political party and either a House of Representatives or a Senate or indeed a whole Congress dominated by another. Far from controlling the legislature, as is inherently the case in the British system, the American executive often finds himself in opposition to it.
It's often said that the US constitution is new, or at least much newer than the ancient constitution of Great Britain. I'd contend that at least in some respects the opposite is the case. The modern British constitution is very much the product of 1688-89, a 1688-89 that never happened in America. The US Constitution, like all the state constitutions with their independent governors, reflects an older oppositionism between executive and legislature, one that pre-dates the principle of co-ordination that emerged from the Glorious Revolution in England and that was such a divisive feature of the English civil war era and indeed of much earlier English history. And what's true of American political constitutionalism may also consequently be true of American political culture. If the moderation of modern British politics is a product of the co-ordinated nature of the British parliamentary system (as opposed to being somehow inherent in British character, as some ridiculous people would have you believe), then perhaps the sometimes "paranoid style" of American politics (as it has been termed by American political historians) is a product of an inherited roundhead tradition of legislative opposition to the ever-present danger of tyranny that Americans believe executive power inherently represents.
So, what I’m saying is, if you think American politics is a bit weird and wacky, remember, remember the fifth of November (1688).     

[* Academic friends: I'm intending to develop and publish these ideas one day, so if for any reason you are enough of a poppet to consider them worthy of mention, then I'd appreciate a citation. Ta.]