
July 2011


OPINION

The virtues and limitations of monitoring reports

Dr. Salvador Cardús i Ros - Dean, Faculty of Political Science and Sociology, Autonomous University of Barcelona (UAB)

After all this time, it is difficult to find anybody who openly and radically dares to challenge the need for the assessment of academic work and for systematic and transparent monitoring. However, it is one thing that no one objects to it; it is quite another whether a) the end use of the process goes unquestioned, b) no objections are raised to the systems that are supposed to make it possible, and c) the decision-making mechanisms that follow on from analysis and committed proposals receive any attention at all from those involved. I will leave to one side the fact that the first two types of resistance have to do with a certain work ethic, or lack of one, in public institutions, a matter that would require more specific and extensive consideration, and move directly to my actual experience of drawing up the monitoring report and the virtues and limitations that I have observed.

Our experience at the Faculty of Political Science and Sociology with the first monitoring report, covering the first academic year of the new Bachelor's degrees in Political Science and Public Management and in Sociology, together with all of the faculty's Master's programmes, was, I must say, good, even very good, in the first stage of the process, i.e. when it was being drawn up. Transparency about the main problems in the way teaching is organised was nothing new to us: we had already carried out various exercises of this kind, opened up to public debate through the newsletter sent to the entire community, in the annual workshops devoted specifically to teaching that have been organised for some time now, and in extraordinary meetings of the faculty board. A quality committee was even set up. So a formally laid-out instrument for the further enhancement of teaching, far from creating any kind of conflict, was more a new way of channelling an ongoing concern. In this respect, it never felt like any kind of upward accountability, to the university (UAB) and the QA agency (AQU), but rather a new opportunity to make our concerns evident downstream, i.e. to the teaching and student community.

As for the indicators proposed in the monitoring report protocol, inasmuch as they are standard and follow strict design processes, and therefore, in addition to being reliable, allow for comparison with other programmes, they seemed perfectly suitable for the proposed aims. In actual fact, they are the same ones that we ourselves had decided to use with the university's database. It is probably a good idea that institutions are able to add extra data. For example, at the UAB it would be interesting to be able to analyse students' place of origin and place of residence, data of particular importance for detecting specific problems related to timetabling, individual working conditions and so on. Other figures that should be included concern students' employment while at university, its actual nature and extent, and reports and surveys on their professional employment later on. It is true, however, that the available figures on this are still partial, irregular and sometimes unreliable.

On the other hand, the work of reflection leading to the qualitative assessment was relatively easy, for the reason mentioned above: it had already been done in response to various requests within the faculty. The active role of the Bachelor's and Master's programme coordinators, the discussions with the respective committees, the collaboration of the administrative and service staff and the overall synthesis by the vice-dean for Academic Affairs in the end produced a very honest and fully developed report, completed within the time set by the university. Problems were identified by consensus, proposals for action were well thought out and responsibilities were well defined. No changes were made following the presentation of the report to the faculty board, and the information that needed consideration was made clearly apparent. As a result, the report proved to be highly useful and will be even more so for assessing how commitments have been fulfilled and for following trends in the available data and figures.

Nevertheless, and precisely because of the satisfaction I take in my entire team's commitment to their work, I must confess to feeling profoundly annoyed about our ability to act in accordance with, and on a level with, the objectives that stem from this type of exercise. If this is not to become an annual routine in which a kind of nominalist cunning develops that progressively erodes the institution's sense of responsibility when it comes to making important and effective decisions, establishing priorities and attributing responsibilities, there will need to be sufficient authority for commitments to actually be implemented. And it is here that the initial willingness I have mentioned does not correspond to the actual possibilities we have of carrying those commitments out. The obstacles, to use a general term, are of two kinds. The first is the enormous difficulty of broadening the reach of the monitoring report's content so as to actively include the community of teachers as a whole. There is indeed effective collaboration between the programme coordinators and the corresponding committees, although all in all this involves only around twenty people, including students, administrative and service staff, and teaching and research staff. The presentation to the faculty board informs a few more people, although most of the twenty also sit on it, but this is clearly insufficient in relation to the main body of over 150 teachers who should be involved in, and concerned about, learning outcomes and the proposals that have been defined. The problem is a serious one in that it goes beyond the efficacy of the monitoring report and has to do with the other obstacle: what prevents something like the monitoring report from having the effect it should is the limited authority of the dean's team to encourage more responsible and involved action from the academic community as a whole in the faculty's general projects.

Ultimately, communication strategies are needed to bring the community together around corrective and enhancement measures such as those set out in the monitoring reports, but above all effective management mechanisms are needed so that teaching staff in particular feel involved in the faculty's future and participate not just in carrying out analyses but also in applying the decisions made by the university's governing bodies, both at the level of the university in general and at that of the faculty or institute. My impression is that the monitoring report is, and will continue to be, really useful in terms of accountability. I have more serious doubts, however, about my own ability to involve those around me, the ones who have to deliver the enhancements proposed in the report, in the process.
