Governmentality & commodification, the keys to Yanqui academic hierarchy

by Toby Miller

Toby Miller, professor of English, Sociology, and Women’s Studies at the University of California, Riverside, talks about the consequences of the commodification of higher education in the US, an analysis that applies to universities across the world as they negotiate the pressures of neoliberal economics.

Eighteenth-century European Enlightenment knowledge invented social collectives and liberal individuals. Since that time, populations have been understood through statistics and policy interventions: the social body assayed and treated for its insufficiencies. Governing people came to mean, most critically, combining science and government to maximize civic management and economic productivity. Such developments coincided with and cross-pollinated economic transformations that forged industrial and finance capitalism.

In this brief piece, I aim to explain how the history of US universities is characterized by an expansion of governmentality, in the sense of research undertaken for the public weal and teaching that reaches into the lives of the populace to train it in self-regulation; and an expansion of commodification, as research becomes animated more and more by corporate needs, students are increasingly addressed as consumers of education, and paymasters and administrators accrete authority over academics. Both tendencies increase hierarchization.

Many writers working within the governmentality tradition do so in a way that assumes incommensurability with Marxist critique. I see no logical reason for this. I acknowledge that the project of neoliberal governing-at-a-distance has its own logics and materialities; they fit the agenda and methods of corporatization as much as governmentality. I argue that both tendencies have been at play since the emergence of higher education as part of public culture in the US 150 years ago, but that neoliberalism has maximized their influence in recent times.

The classic US model of higher education aims to equip students with a liberal inclination that respects knowledge of a topic and for a purpose, rather than simply knowledge by a particular person. The model places its faith in a discourse of professionalism rather than charisma. It urges people to believe in and exchange openly available knowledge, not secret magic. In other words, if someone truly wants to know how television works, she is permitted access to this intelligence. But she may equally subscribe to digital cable simply based on her confidence in the system of governmental and university research, industrial training, and accreditation that impels and regulates this fraction of a culture industry. She need not do so based on the idea of audiovisual communication as a gift from a deity to an elect whose knowledge and power cannot be attained by others. Of course, liberalism also uses the concept of human capital: that there should be a mutual investment of time, money, and training by both society and subject to create a corps of able-minded technical employees and willing patriots who are taught by a docile professoriate; in short, the idea of higher education as an industry, and of students as investors. Hence Bruce Johnstone, a former Chancellor of the State University of New York, offers the concept of ‘learning productivity’ as part of students beginning to ‘assume greater personal responsibility for their learning.’ How did this state of affairs come to pass?

Since the 1830s, when the first waves of white-settler European immigration across classes began, US higher education has generated practices and knowledges for use by the state and business and to integrate the population. By the 1850s, with the country rapidly industrializing, new chiefs of industry envisaged partnerships with tertiary education to develop a skilled workforce. Abraham Lincoln’s Republican Party enabled this alliance via the land-grant system. Technocratic from the first, it flowered at the turn of the century, when corporations were placing more and more faith in applied science via electromagnetism, geology, chemistry, and electricity. By the twenties, Harvard had its business school, New York University its Macy’s-endorsed retail school, and Cornell its hotel school. No wonder, then, that Thorstein Veblen referred to US universities as ‘competitors for traffic in merchantable instruction.’ His words remain accurate in their diagnosis (even if their style looks old-fashioned). The two World Wars provided additional pump priming and premia on practicality from the Federal Government, and the big research schools actually expanded their capacity during the Depression. Today, a financial dependence on private sources is twinned with what we might call the mimetic managerial fallacy, a process whereby both governments and university administrators construct corporate life as their desired other. This makes for untimely influences not only on the direction of research and teaching, but also on the very administration of universities, which are increasingly prone to puerile managerial warlockcraft superstitions about ‘excellence’ and ‘quality control.’ Academic institutions have come to resemble the entities they now serve: colleges have been transformed into big businesses. Major research schools, particularly private ones, are also landlords, tax havens, and research-and-development surrogates, with administrators and fundraisers lording it over Faculty. Decanal apparatchiks have essentially replaced Faculty governance. College bureaucrats are making a transition to full chief-executive-officer stature.

The mimetic managerial fallacy also leads to more and more forms of surveillance from outside. Regional accrediting institutions vouching for the quality of US degrees have been in place for well over a century. But since the 1970s, we have seen ever-increasing performance-based evaluations of teaching conducted at the departmental and Decanal level, rather than in terms of the standard of an overall school. Today, such methods are used by 95% of departments. These systems directly link budgets to outcomes, in keeping with the prevailing beliefs of public-policy mandarins and their restless quest to conduct themselves like corporate elves manqués. As successive superstitions came along (the 1990s variety was Total Quality Management), administrators fell in line with these beguiling doxa. Along the way, Faculty-student ratios worsened, and reporting, surveillance, and administration grew in size and power. Many of us who have actually worked for business and government know what laughably inefficient institutions they can be; but then, those who watch academics do research and teaching from the perch of administration frequently have ressentiment in their eyes and underachievement on their résumés.

In the research domain, the notion of mutual interest licenses partnerships between state, college, and industry, dating back to 19th-century museums, observatories, and agricultural-experimentation outposts. The shop was really set up in the late 1950s. The Cold War stimulated growth, increasing federal and state subsidies. Considerable effort since then has gone into clarifying the significance of tailoring research priorities to governments and corporations. Consider linguistics (the scandal of language-spread policy); political science (Project Camelot in the 1960s); economics (Robert Triffin acting as plenipotentiary for the US to the European Economic Community and then as a European delegate to the International Monetary Fund, just a few months apart, in the 1980s); sociobiology (defenses of male sexual violence); and psychology (participating in torture during the latest War on Islam). The very existence of communication research raises questions of ideological distortion, given the discipline’s formation under the sign of war and clandestine state activity and later corporate and foundation support. The same could be said of the policy sciences. Originally conceived as points of connection between democratic and executive action, they have degenerated into expertise that lacks articulation with everyday people, connoting pro-corporate/pro-Christian positions that turn highly contestable claims into absolutes, with consultant professors simultaneously performing objectivity and applicability.

This history predates contemporary concerns about how to finance US research universities since the system lost relatively disinterested Cold-War stimuli to big science in the early ’90s. Today, it appears as though governmentalization and commodification have merged in their concerns and methods. Congress provides more than a billion dollars in direct grants to universities, apart from the peer-reviewed funds available through the National Science Foundation and the National Institutes of Health. But whereas corporations gave US schools about US$850 million in 1985, the figure was US$4.25 billion a decade later. The NSF established dozens of engineering research centers in the 1980s with the expectation of “partnerships” flowering between corporations and higher education. Such centers have effectively functioned as ongoing public welfare for “entrepreneurs.” Industrial research parks now dominate the work of such schools as Texas, Massachusetts, Duke, North Carolina, and Stanford. And MIT’s media laboratory is a play-pen provided by corporations for well-meaning but apolitical graduate students working with implicit and explicit theories of possessive individualism: an ethos of fun in which the latter may privately claim to be subverting their paymasters, but where they do so in ways that are eerily reminiscent of the dot-com boom’s empty cybertarianism.

The extraordinary Bayh-Dole Act of 1980 permits non-profit educational institutions to own and commercialize inventions, provided that the state can use them as it sees fit. Prior to the Act, research schools collectively accounted for about 250 patents a year. Now the figure is close to 5000. Perhaps 3000 new companies have emerged as a consequence of the legislation. It should come as no surprise that US universities are increasingly business-like entities, at times taking legal action against their own researchers to make as much money as possible. The idea of working in the public interest has been erased through amendments to state laws throughout the country that have quietly exempted publicly-funded scientists from conflict-of-interest responsibilities that apply to refuse workers and personnel officers.

Medical drugs are a case in point. US deregulation has propelled marketing into the forefront of drug development, and pharmaceutical corporations (pharmacorps) deem old-school academic research and education too slow for their financial rhythms. Recent evidence suggests that marketing as much as medicine determines how to develop a new chemical compound once it has been uncovered: whether it will be announced as a counter to depression or ejaculation; whether it will be promoted in journal x or y; and which scholars will be chosen to front it and produce consensus about its benefits. Leading figures in medical schools and professional practice routinely accept monetary and travel gifts from companies as a quiet quid pro quo for favorable publicity of this kind. Pharmacorps budgets for marketing to clinicians have skyrocketed, and they pressure medical journals to print favorable research findings in return for lucrative advertising copy. Major advertising agencies that work with pharmaceutical companies, such as Interpublic, WPP, and Omnicom, have subsidiaries like Scirex that even conduct clinical trials. Known as medical education and communications companies, they brag about ‘getting closer to the test tube.’

The desire for sales and speed and the need to observe protocol meet, ironically, in scholarly journals, which the giant pharmaceutical multinational Pfizer describes, rather alarmingly, as a means ‘to support, directly or indirectly, the marketing of our product.’ Little wonder, then, that medical education and communications companies provide ghostwriting services, paid for by corporations, that deliver copy to academics and clinicians, and pay them for signing it. One in ten articles in the leading US medical outlets is today estimated to be the work of ghosts, and 90% of articles about pharmaceuticals published in the Journal of the American Medical Association derive from people paid by pharmacorps. Faculty are shilling for corporations by allowing their names to go on articles that they have neither researched nor written, for all the world like footballers or swimmers who have never even read, let alone penned, their ‘autobiographies.’ Instead, these corporate subsidiaries write the papers on behalf of academics.

The prevalence of ghostwriting has led the International Committee of Medical Journal Editors to establish criteria that require authorship attribution to verify who undertakes the research and writing that go into manuscripts. It’s good to see that editors of the leading medical journals are speaking out against these dubious practices. But next time you are perusing a CV that includes endless four-page articles signed by 27 people allegedly working together on pharmaceuticals in a laboratory, the field, or clinical trials, you might want to ask whether the real ‘author’ was even listed. And you might begin to query the assumption that the sciences and medicine are at the heart of scholarly rigor. When Barthes wrote of the ‘death of the author,’ and Foucault described writers as ‘author functions,’ their ideas were belittled by many. But using such insights, perhaps it is time to name and shame the ghostly figures who produce so much ‘scholarly’ literature, and expose the farcical faculty who function as the public face of this deceit, perched atop research schools.

Turning away from research, we can see a tendency across the entire degree-granting sector of transferring the cost of running schools away from governments and towards students, who are regarded more and more as consumers who must manage their own lives, and invest in their own human capital. In 1980-81, the three levels of government accounted for 48.3% of funding, whereas the proportion was 38% in 1995-96. This trend towards reliance on tuition doubled student debt between 1992 and 2000. One thing is common across US higher education: the crisis of student debt in an era when tertiary studies are financed more and more at personal cost. For a decade and a half, tuition increases have outstripped inflation, rocketing beyond stagnant levels of Federal aid to students. As a consequence, corporate lenders have become central to financing undergraduate degrees. Private debt has more than tripled in the last five years, to US$17.3 billion in 2005-06. And while Federal loans are capped at a 6.8% interest rate, private ones can soar as high as credit-card levels of 20%. New legislation makes defaulting on such loans through bankruptcy virtually impossible. So even as students are increasingly being told, rightly, that only a college education can deliver a middle-class lifestyle, they are facing accumulated debts of US$100,000. And that’s before they enter professional schools to become lawyers or doctors, when they will need much bigger loans.
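To make that rate gap concrete, here is a minimal sketch, in Python, of what it implies for the US$100,000 figure just cited; the ten-year repayment term and the standard fixed-rate amortization formula are my own assumptions for illustration, not details from the article. On those assumptions, the federal rate means roughly US$1,150 a month and about US$138,000 repaid over the decade, while the private rate means roughly US$1,930 a month and about US$232,000.

```python
# Illustrative sketch only: the gap between the capped federal rate (6.8%)
# and a credit-card-level private rate (20%) on a US$100,000 balance.
# The ten-year term and the standard amortization formula are assumptions
# made for illustration, not figures from the article.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12        # monthly interest rate
    n = years * 12              # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 100_000.0
for label, rate in [("federal (6.8%)", 0.068), ("private (20%)", 0.20)]:
    payment = monthly_payment(principal, rate, years=10)
    total = payment * 10 * 12
    print(f"{label}: about ${payment:,.0f} a month, ${total:,.0f} repaid in all")
```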

Shifting the burden onto students to be financially responsible for their education supposedly makes them keener learners, while encouraging additional scrutiny of the classroom is said to aid them in a space of traditionally unequal relations of power. But that Pollyannaish analysis will not do. First, as more and more funding in fact comes from private sources, it is they who are acting governmentally to ensure returns on their investments, both ideologically and monetarily. Second, the address of students as liberal agents both distorts their actual subject-positions, and under-prepares them for the obedience and absence of free speech required in most US workplaces, in addition to adding to the central power of has-been and never-were academic administrators over working scholars.

And what of those working scholars? The world of hiring varies enormously, based on the class structures that divide academia. My department is currently running searches for two positions. They are not in the sciences, or in professional categories that carry salary loadings. The candidates won’t be expecting, say, US$200,000 as start-up funds with which to build their research in the expectation of large grants that will help pay for university administration. Nor will they expect to be remunerated as though they were suffering the slings and arrows of opportunity cost by not working in corporate America.

I am speaking above of those privileged few who have tenure or tenure-track positions in Research-One schools. Most people teaching in universities are freeway professors who travel feverishly between teaching jobs, cobbling together a living, or folks working full time in second-tier schools with gigantic course loads. Inside the top universities, there is also great variety. When I was a full professor of cinema studies, American studies, and Latin American studies at NYU, I was paid four-fifths of the salary of the average starting untenured assistant professor in the law school, and one tenth of the salary of a particular advanced assistant professor in the medical school (she worked on fertility drugs, so this figure was not typical of her cohort). How did I know this? In the case of the law school, through senior people who told me. In the case of the medical school, even private institutions are obliged by the Internal Revenue Service to disclose their top three salaries to public view. In general, divide-and-conquer is the leitmotif of these schools. The notion that one’s income is a matter of privacy is a technique for preventing employees from sharing information and hence being able to lobby collectively. This is aided by the Supreme Court’s Yeshiva decision, which holds that full-time faculty at private universities are managerial employees, and hence have no right to engage in collective bargaining, i.e. via a union. The wager that such schools make is that you won’t demand what you don’t know you can have.

One thing’s for sure. The negotiations for our current positions on offer won’t be as complex as those involving a guy I knew who moved to an Ivy League school a few years ago and told me that his new department had to work overtime to guarantee his US$500,000 a year personal travel budget. Nor will they equate to the person I used to work with whose deal promised her time and money for weekly visits to a different city to ensure continuity with her preferred therapist. And these discussions will differ from those entered into by thousands of adjuncts each year as they await last-minute phone calls and messages asking them to teach courses to hundreds of students, because full-time faculty are doing their ‘own’ work. The discussions won’t reference the experience of students looking for the ‘professor’ who taught them last quarter, who didn’t have an office, who won’t be back this year, and is forgotten by all concerned other than the personnel office, which has closed her file until the call goes out again for the reserve army of the professoriat to emerge from freeway hell in time of need.

And the future? Apart from the large number of undergraduate students and cultural-studies professors watching reality-TV shows, the idea of the makeover resonates monumentally with US colleges. Several high-profile schools have undergone huge transformations in recent times. The first instance was probably Duke University. Set up and supported by tobacco money and plantation history, the North Carolina campus spent vast sums of money from the 1980s in order to elevate itself into the top echelon of Research-One universities, hiring people from all across the world to improve its standing.

In the early 1990s, NYU decided to do the same thing. It embarked on a massive fundraising campaign amongst its trustees and others who were keen to make the scene as major benefactors in the Manhattan philanthropy set. Following Duke’s model, NYU decided that it needed to improve its standing in the basics of a university: the arts and sciences. It already had highly-ranked law and medical schools, but they are professional entities as much as research centers and do not generate scholarly esteem in the same way that mathematics and history can do, for all the power they exercise in the university and the wider society. Studies indicated that a massive influx of renowned faculty into the arts and sciences could have an immense and immediate impact on the quality of graduate-student applications, and then on undergraduates. In less than a decade, NYU went from a second-rate commuter school to having top-notch students from all 50 states and half the world.

How were professors attracted to move? Huge salaries, New York City, buying whole departments to keep stars company, light or non-existent teaching loads, generous travel money, spousal hires, and a sense of making a difference. What was this like for those who were already in place? The Law School didn’t care; it had absolute independence financially and managerially, other than in the naming of a Dean. The Medical School was absorbed in its own version of a pressing national issue: what to do with white elephants (AKA teaching hospitals). The low-rent professional schools, like Education and the Arts, were left out, because they didn’t fit the paradigm, and exercised little or no power on campus other than as public symbols. People who had toiled away in lowly-ranked arts and science departments were variously flattered and angered by the sudden appearance of superstars and their baggage of psyches, somas, libidos, and lofts.

The latest school to follow this model is the University of Southern California. Located in south-central Los Angeles, where the rebellion occurred after the Rodney King trial of 1992, USC has long been a bastion of wealthy, not-very-smart white students and faculty skirting an area of multicultural poverty. Again, it had excellent professional schools, and also boasted a renowned athletics department; but in the basic research areas, not so much. ‘USC’ was widely regarded as standing for ‘University of Spoilt Children.’ No longer. Nowadays, schools that it has raided for top talent refer to USC as the ‘University of Stolen Colleagues.’ All the money that flows in each time the football team wins is now being cycled into buying the best faculty across the basic disciplines. In New York, the challenge was to look good alongside other private schools, notably the nearest Ivy League representatives, Columbia and Princeton. In California, the point of comparison is public schools, notably the University of California system’s leading lights, UCLA and Berkeley. It will be a while before USC can compete seriously with those testaments to the wisdom of public-cultural investment. But it will get there. If there is a lesson here, it is that the coarseness of commuter campuses and homely professors can be made beautiful. Money remaketh the university.

Neoliberal ‘reformers’ in other countries are fond of referring to the decentralized, mixed-market model of US colleges as a beacon. The truth is that this model’s success relies on long-established, disinterested ruling-class wealth, in the case of the Ivy League, and competitive boosterism by individual States, in the case of the public sector. When the actual costs of running universities are passed on to students, the results can be devastating. And the crisis contributes to a wider national problem of gigantic personal indebtedness. It does so in the context of governmentality and commodification: today’s recipes for academic hierarchy, Yanqui-style.

— this article by Toby Miller was originally written for Edu-Factory.org and has been reproduced here with permission of Toby Miller and Edu-Factory.org
