Universities have a new benchmark to live up to in the uncapped system, with the pilot University Experience Survey revealing student satisfaction levels of about 80 per cent last year.
But experts caution against reading too much into the results, saying the pilot was designed to test new measurement tools rather than generate baseline data. The UES, which has yet to be released, found 82 per cent of students were satisfied with the quality of teaching and 79 per cent with their overall education experience.
It found 74 per cent were happy with assigned books and resources, 80 per cent were satisfied with online learning resources and 75 per cent found their courses relevant. But only 63 per cent found administration staff and services helpful, and just 49 per cent were impressed with support services.
Over a quarter of students said they received little or no support to settle in, with 10 per cent avoiding university services altogether. And 21 per cent of students expressed little or no sense of belonging to their universities.
The survey of first-year undergraduates, conducted last August and September, attracted 20,000 respondents across 24 universities. It produced similar results to the 2010 Course Experience Questionnaire, which revealed overall satisfaction of about 80 per cent.
But it found university students were less satisfied than their VET equivalents. Over 89 per cent of government-funded vocational course students reported overall satisfaction with their training last year. However, Tertiary Education Minister Chris Evans said the results showed universities were “delivering high quality teaching and learning”.
“As more students enter Australia’s universities, we must ensure universities remain focused on improving the student experience by helping students complete their studies.” The executive director of Innovative Research Universities, Conor King, said universities would be “pretty happy” with the findings. But he stressed that the pilot’s purpose was “to create a way of capturing student experience that could give some reliable data over time”.
The pilot did not even attempt to “generate large response yields”, the report says; its goal was to produce data to test survey methods “and conduct psychometric analyses”. “Because of this, a representative and sufficiently large response yield was neither sought nor required,” it says.
Mr King also noted that the pilot had been linked to a performance funding scheme that had since been scrapped. Last November the government dumped plans to reward improvements in student learning and experience. The payments were to have been tied to performance measures including the UES.
Mr King said the move had eliminated the main advantage of the UES. “It is the only such tool with the intent and purpose of being able to support funding allocations,” he said. The report acknowledges other student experience surveys including the CEQ, Australian Survey of Student Engagement (AUSSE), First Year Experience Questionnaire, Graduate Pathways Survey, Graduate Destination Survey, Postgraduate Research Experience Questionnaire and International Student Barometer.
Mr King said that as a general information source, the main advantage of the UES was that it was “reasonably succinct”. He said it struck a good balance between extracting information and alienating students. “If you want to get good responses from students, you don’t want to ask too many questions,” he said.
An education department discussion paper released late last year canvassed the possible scrapping of the CEQ, given its overlap with the UES. But RMIT University policy analyst Gavin Moodie said the CEQ had existed since 1992, and the UES was unlikely to be a sufficiently major improvement to warrant discarding 20 years of longitudinal data.
“The better course would be to adapt the CEQ again rather than replace it,” Dr Moodie said. Senator Evans said the UES would be trialled among all universities this year, with the results to be published on the MyUniversity website from 2013 if the trial proved successful.