by MERLIN CROSSLEY

At the end of every teaching session, all UNSW students are invited to provide feedback on their experience of each course (i.e. each subject, like Maths 101 or History 201) via the MyExperience Surveys.

We recently set up a dashboard to provide graphical summaries to staff and students.

The good news here is that we are listening to the student voice and respecting students’ reports on their experiences. I also think it will help us to showcase and celebrate some of the outstanding courses that are really appreciated by the students. We can all learn something from these courses.

At the same time, we can look at courses that are not yet meeting students’ expectations and check whether there are any problems, or whether it is just that sometimes learning “essential Latin grammar” isn’t that popular! But we check under-appreciated courses already, so there is no real change there.

But are there any concerns about the dashboard? Is this metrics gone mad?

I think we have to be cautious, and we will be.

Firstly, we deliberately named the student surveys MyExperience Surveys to indicate that these are surveys about an experience, not about teaching quality, effectiveness, or learning. We have many other ways of assessing learning, and we use peer review and mentoring for teaching.

Secondly, although we collect feedback on the course and on the teacher, we are not making feedback on teachers available. We will only release data on courses, and only those courses that are taught by multiple teachers, so no one can point fingers, good or bad, at individuals.

Thirdly, we will not publish data when the sample size is too small to be meaningful. We care about response rates and the number of responses.

Fourthly, we are only showing graphs on the usual Likert scale: strongly agree, agree, neutral, disagree, strongly disagree. We are not publishing the comments.

Some comments can be problematic and betray biases, unconscious or otherwise, that exist in society. We explain to students that the surveys are confidential, in that we do not reveal the names of students providing feedback to their teachers, but they are not anonymous. Anonymous comments can be problematic. We screen our surveys for inappropriate comments and can take action if our community standards are breached.

But shouldn’t we eschew student feedback altogether, as multiple studies have detected sexism and racism in surveys? We have carried out a number of studies on this topic. Our community is no different from any other community: we are not free of biases. But discussing all this openly helps us to be alert to biases and context when interpreting student feedback. It doesn’t seem right to suppress the student voice just because students, like everyone else, are not free from biases.

What about staff anxiety? Will this increase if student experience scores are made available to our community?

I recognise that some staff do worry about this, but my expectation is that many will take comfort from the fact that their courses are good and broadly in line with others. I’ve looked at the graphs and the vast majority are similar, and they are good. In my career I have noticed that anxiety occurs more where there is an absence of information on achievement. Visible data can provide the high-performing academics we employ with security and boost their confidence.

Should we disregard student feedback since students are not in an ideal place to evaluate their own learning? A good experience is one thing, but isn’t learning the main thing?

I believe that in an ideal world we would aim to have both. I have seen many superb teachers who manage to drive learning while also providing a good student experience. I think that is what we should be aiming for. Torturing students to make them learn might work too, but why do that when one can gently coax students to success?

Finally, won’t focussing too much on the “student experience” lead to soft marking and a decline in academic standards?

Frankly, that is the last of my worries. If anyone really thinks our standards are declining, I invite them to try one of our electrical engineering courses. Sometimes I worry our high performing academics expect too much from our students, and push them too hard. It’s also important to note that the surveys are confidential and disconnected from the assessment period, so I am not worried about them causing soft marking.

When we consulted staff and students about this how did they respond?

The students were supportive and had long been pushing us to make their own feedback data available to them.

What about the staff?

I first published course data when I was dean of science at UNSW in 2014 and remarkably few people noticed. When we announced we were planning to do it university-wide last year a small number of staff raised concerns. The Fair Work Commission was asked to decide whether we were correct in thinking it was in line with our Enterprise Agreement. The initial finding was that our Enterprise Agreement forbade it. We appealed and were told that our Enterprise Agreement allowed it.

Where are we now?

We now have permission to publish the high-level graphical data, but I am also now very aware that some staff have concerns. I hope to be able to reassure those staff – who I know meant well when they sought the advice of the Fair Work Commission – that we mean well too. I feel that this is important. To be blunt, data on research achievements are everywhere. But some of our most appreciated courses are invisible to our community. Now staff and students will be able to see how well-appreciated our courses are. Finally, we can celebrate the good work and learn from it.

Where we find a problem, won’t managers like me now step in and ask a team of educational mentors to help improve the student experience in the course? Yes. But we already do that. It’s just that in the past only line managers could see the dashboard; now our whole community can.

Not much has changed but sunlight is flooding in and highlighting some remarkable achievements related to the student experience at UNSW. You don’t need to be a botanist to know that sunlight makes good things grow. The information will help our best courses to become better and our developing courses to develop quickly. And we will aim to be careful as we develop our educational garden.

Professor Merlin Crossley

Deputy Vice-Chancellor Academic

UNSW Sydney

