[csaa-forum] ERA rankings

Danny Butt db at dannybutt.net
Tue Jul 8 22:23:40 CST 2008


Hi all

Very interesting discussion, and I'm pleased to see the group active  
on coordinating responses, which I think is a very powerful  
intervention. I'm currently engaged in research on research
assessment systems as they relate to creative practice disciplines
(esp. art/design/music/screen production), so here are a couple of
observations from that recent itinerary.

It's interesting to see the ERA going the journal assessment route.  
Both the UK RAE and NZ's PBRF explicitly avoid this, focusing instead
on peer assessment of research and trying to maintain a distinction
between "quality of research" and distribution platform, as if there
might be a brilliant scholar toiling away in low-impact journals doing
great research that might one day be seen as revolutionary in the
field. That is probably one of the great psychobiographical narratives
about research, one that even my father, who never went to university
but reads pop science, holds as dear as any of us. I'm sure that in
CS-related disciplines we've all experienced exclusions from the
administrative hierarchy that give us some investment in those
narratives too. And there is a certain truth to it.

On the other hand, there is a dynamic where people do use the  
gatekeeping model as shorthand in assessing the "level" someone is  
working at. If someone has a solo show at the Tate Modern, that  
doesn't mean that I will necessarily *like* the work more than a  
project at my favourite artist-run space, but it is in some sense a  
*quantitatively* different set of obstacles to negotiate that demands  
a certain level of respect. In both the RAE and PBRF it is widely
believed that assessors informally rely on journal impact factors in
their assessments, despite guidance to ignore them.

I have to say that as far as these kinds of assessment exercises go, I  
can't say for sure that journal impact factors are worse than direct  
assessment of research outputs by "expert" panels. That's not to say I  
like the idea of journal assessment, and I do think that the draft  
Australian rankings are full of egregious errors that have already  
been pointed out. But I think that in humanities/social sciences  
disciplines, academics tend to group around shared methodological and  
ideological interests, and I am certain most people have had the  
experience of a "review" that basically consisted of a critique of the  
whole rationale for the project, rather than any of the detail about  
how the research was carried out. The whole question of "problem
selection" (what problems are considered important to study) and the
attendant methodologies plays out differently in the
science/technology/medicine discourses that originated the concepts of
peer review and impact now embedded in the ERA process. In our
disciplines, though, it is central to the stakes of the project. The
projects I find fascinating will be of no interest to, or actively
hostile towards, the world-views that shape other projects. In this
scenario, having an expert review your work individually potentially
just puts the people appointed to the panels (likely to be established
scholars from more established institutions who cut their teeth a
generation earlier) in the position of reviewing work that is often
aiming precisely to overturn the paradigms those assessors set in
place.

My anecdotal sense of the NZ experience is that some individuals
in humanities, social sciences and creative arts clearly suffered from  
that kind of bias, even (or perhaps especially?) where they were  
internationally recognised and interdisciplinary. In that respect,  
going into bat for the journals we value during this process is  
probably easier than dealing with a cursory dismissal of one's  
individual portfolio of research by a largely anonymous process.

Whether Guy's comments are a conspiracy theory or not, there is no  
doubt that this exercise will be used as a policy lever to achieve  
institutional reform. In NZ the implementation of the PBRF coincided  
with a firm re-division of the NZ tertiary sector. Not necessarily  
into teaching vs. research institutions, but into a University sector  
whose research could be evaluated on quality, and an Institutes of  
Technology/Polytechnic (ITP) sector which would be evaluated on  
knowledge transfer / linkage / impact. This has proven problematic
not only for the ITPs, who are essentially placed in the second
division and were actively discouraged from participating in the
research assessment process the first time around. The bifurcation
has also made it difficult for more applied researchers working in
research universities, who have understood that the most innovative
work in their disciplines happens precisely at the edges of the
university/external interface. This is especially true in "newer"
disciplines such as sports science, nursing, clinical psychology and
design, all of which, like CS, have a history in polytechnics and
applied learning institutions. I wouldn't underestimate the forces
that might want to see some kind of rollback of the Unified National
System in Australia.

Finally, while I don't really think Ned's suggestion of refusal is  
particularly viable right now, I do love his follow-up idea of stacking
the deck to blatantly support nationalist research. It really cuts to  
the core of the issue, which is that the idea of education as a  
benefit for the world has been thoroughly transmuted into a benefit  
for the self. Along those lines, patriotic cultural nationalism (or  
its offspring, national economic development) is increasingly the only  
politically viable justification for government funding of  
universities. As budgets shrink and participation rates expand,
politicians need to trim budgets while addressing the perception that
academics are both not useful and likely to be unpatriotic because
they spend too much time in foreign countries. By clearly valuing
research of national benefit we might lose our capability to
participate in the international journals that lead our field, but we
would provide much-needed support to local publishing infrastructure.
Perhaps the internationalism of Fibreculture Journal or borderlands
e-journal could be seen to be aiding a national intellectual export
economy?

Cheers,

Danny

--
Danny Butt
Lecturer, Critical Studies
Elam School of Fine Arts, National Institute of Creative Arts and  
Industries
The University of Auckland
Private Bag 92019, Auckland, New Zealand	| http://www.creative.auckland.ac.nz
http://www.dannybutt.net
Ph: +64 9 373 7599 x 89922
+64 21 456 379

On 8/07/2008, at 7:42 PM, Guy Redden wrote:

> I've just finished an article about the RAE/RQF/ERA, reading them as  
> neoliberal performance management frameworks. It looks at public  
> choice theory and other trends in public sector managerialism, and  
> considers the UK RAE (under which I previously laboured) in relation  
> to current/proposed Australian systems. As I've submitted it to a  
> 'B' journal, it's probably of dodgy quality, but if anyone is  
> interested I am happy to forward it off-list.
>
> IMHO, the ERA's current research outlet rankings (and they are also  
> planned for monograph publishers, conferences, etc, not just  
> journals) will open a big can of unintended consequences. Not only  
> is publication venue a questionable proxy for quality, but academic  
> publishing is a broad ecology that includes many kinds of  
> publication, in which people publish for a range of good reasons.  
> Once the A*, A, B, C tiers become a central focus in the
> micromanagement of research within institutions, that whole ecology  
> could be distorted in ways not yet known.
>
> The stampede for A* and A outlets will no doubt produce more  
> Australian publications in them, but at what cost? E.g. whither  
> specialisms? The lists of rankings privilege high-profile ‘general  
> interest’ journals within disciplines. If you happen to work in a  
> field with specialized journals ranked B and C, via which your peer  
> community works through its core issues, what will you do? Accept
> your designation as a second-class citizen, or give up the work you  
> are dedicated to by vocation in order to strategise about ways of  
> getting into the top tiers? For similar reasons, whither  
> interdisciplinarity? And whither place-specific work? The problem
> for law, where many publications are directed towards jurisdictions,  
> has been summarized in a piece in the Australian:
> http://www.theaustralian.news.com.au/story/0,25197,23921819-25192,00.html 
>  (most of the ‘top’ journals are inevitably focused on U.S. law).   
> This is likely to be a problem across the humanities and social  
> sciences as publications addressing local issues have smaller  
> readerships/citation counts, meaning they are less likely to be  
> deemed ‘top’ in their fields. This would appear to be a bizarre bias
> against important Australian-specific research and associated  
> outlets. We are entitled to ask what the public good is of this  
> arbitrary valuation of ‘disciplinary internationalism’ over  
> everything else.
>
> A real concern is the prospect of outlet ranking being used to  
> determine research capacity funding. The ERA is openly being talked  
> about as the successor to the RQF, and the latter was designed as a  
> means of concentrating more research funding ‘at the top’ by  
> bringing more quality-related performance criteria into the funding  
> formula than the current HERDC block grants. Although the ERA is  
> shorn of cumbersome peer review of publications and impact measures,  
> it maintains the fundamental quality ranking approach of the RAE/ 
> RQF. The double whammy will come if/when politicians decide what  
> price to put on outlets. I have no idea what the values will be, but  
> imagine they use Who Wants to be a Millionaire logic by rewarding  
> outputs as follows: C=$1,000, B=$2,000, A=$4,000, A*=$8,000. (This  
> would actually be generous to the lowly in comparison with the RAE,  
> which had no funding for the lowest rankings of publications, i.e.  
> 'C'.)
>
> So what? Possible answer: the Bradley committee review of higher  
> education currently underway is explicitly examining the question of  
> teaching vs. research, which can be decoded as ‘should we revert to  
> a sector of teaching-intensive/only and research-intensive  
> institutions’? (http://www.dest.gov.au/sectors/higher_education/policy_issues_reviews/reviews/highered_review/ 
> )
>
> An under-the-radar way of doing this would be to retain a patina of  
> neoliberal meritocracy (all unis are eligible to compete for  
> research funding) while ensuring quality ranking diverts the lion’s  
> share of funds to institutions that are already best placed to earn  
> from the system. If the prices applied to B and C are so low that  
> they do not actually cover real costs of research, institutions that  
> attract those levels of funding in the aggregate will start to lose  
> large amounts in undertaking research, and will have an incentive to  
> give it up.
>
> I acknowledge this may be a conspiracy theory at present, but it is  
> not beyond the realms of possibility. If it comes to pass, it would  
> be a knowledge class system, and not very cultural studies…
>
> Cheers,
> Guy
>
>
> ----------------------
> Dr Guy Redden
> Lecturer in Gender and Cultural Studies
> School of Philosophical and Historical Inquiry
> University of Sydney, NSW 2006, Australia
> Tel: +61 2 9351 8495, Fax: +61 2 9351 3918
> Office: J4.03 Main Quadrangle (A14)
>
>
>
> -----Original Message-----
> From: csaa-forum-bounces at lists.cdu.edu.au on behalf of Ned Rossiter
> Sent: Tue 7/8/2008 5:03 PM
> To: CSAA discussion list
> Subject: Re: [csaa-forum] ERA rankings
>
> after an exchange off-list, I've decided to qualify my earlier  
> posting.
>
> I appreciate the strategic need and rationale for nationally
> published, independent journals like MIA and also Arena, Meanjin,
> Overland, Fibreculture Journal, etc to be playing the ERA game.  And
> in that sense, the case and success of MIA is one I would support.  I
> didn't mean to dispute the quality or role of MIA (it is one of the
> few journals I have subscribed to, and intend to renew once I settle
> into wage labour again). My post was blunt to the point of reducing
> the complications/complexities that operate within the research
> funding/award system. Part of my intention was to hint at the
> (national/local) politics/vested interests that have obviously shaped
> the ranking outcomes.  The farce of impartiality is something that I
> find very difficult to take seriously in such exercises (many of
> which, it has to be said, define institutional life in academia), and
> because self-interest can never be declared as such, the academic
> community is supposed to accept a system that is, I would still
> maintain, inherently flawed.  For that reason I would reject it.
>
> I'm also aware that it's easier to adopt such a position from outside
> Australia (though I remain institutionally connected/affiliated in
> Australia and therefore subject to its funding/research system).
> Nonetheless, I do not foresee the outcomes of the ERA determining
> where I (or others) publish, despite the documented funding/
> institutional ramifications.
>
> An alternative position: given the rule of the bell curve why not
> insist that all nationally/locally/independently published journals are
> assigned A* (MIA, Meanjin, Arena, FCJ, etc), journals that
> predominantly feature Australian academics get an A-B, and all T&F,
> Sage, etc journals get a C.  T&F journals like Continuum fall into C,
> but because they publish the work of many Australian academics they
> might as well get an A or B.
>
> A proposal of this kind is as rational as any other; it supports
> local publishing industries, and it keeps many people happy. And it
> would ensure that the ERA as a system of self-interest and silly bell
> curves remains intact (which it no doubt will anyway).
>
> Ned
>
> _______________________________________
>
> csaa-forum
> discussion list of the cultural studies association of australasia
>
> www.csaa.asn.au
>
> change your subscription details at http://lists.cdu.edu.au/mailman/listinfo/csaa-forum







