Higher Education Academy Surveys for Enhancement Conference

The Higher Education Academy Surveys for Enhancement Conference took place on 17 May 2012 at the National College for School Leadership in Nottingham. The conference explored the use of surveys to improve learning and teaching, and was organised around three parallel themes: using qualitative data to illuminate survey scores; developing and measuring student engagement; and surveying postgraduate students.
Professor Keith Trigwell, University of Sydney, opened the conference with his keynote, ‘Interpreting the students’ message: Using data from learning experience surveys’. Trigwell offered a strong message in this session: learning experience surveys are not just about student satisfaction, but should also be about what is learnt. Drawing on his experience of administering the University of Sydney’s Student Research Experience Questionnaire, Trigwell outlined how good surveys measure indicators of what we desire, and issued a call to action on how learning experience surveys can improve learning and teaching. He suggested that:

- the focus should be on the overall culture, rather than on singling out individual teachers;
- change should be sought even when average scores are achieved;
- scales, item scores, trends and comparisons can trigger action;
- written qualitative comments can be used for clarification (in his case, however, qualitative comments are sent in raw form to programme teams);
- departments should be asked publicly to state what has been done with the feedback;
- information and results should be returned to the students.

He noted that it was impossible to act on survey data alone, as interpreting the student learning experience is complex. One message I took away was that changes can sometimes be made to a learning context without student perceptions (which are what such surveys measure) changing. It may, then, be necessary to work on changing student perceptions of their learning experiences.
The second keynote was from Gwen van der Velden, University of Bath. Her session, ‘Institutional NSS findings: From statistics to the student experience’, showed how the University of Bath has drawn on NSS findings through a cycle of departmental data analysis, action planning, departmental reviews, and empowering students’ views. For van der Velden, the student engagement question (B6.3), ‘It is clear to me how students’ comments on the course have been acted upon’, was the most important; this was where the focus of enhancement activity lay. Students were well represented on faculty learning and teaching committees and staff-student liaison committees, and all students received an overview of the ‘Student Voice’ during their induction. Van der Velden, who leads Learning and Teaching Enhancement, noted that her agenda came through the staff-student committees and not from the Deputy Vice-Chancellor. For her, change happened in the coffee-bar chat; in a later tweet, she noted: ‘by time policy comes, change already there. It’s how we academic developers work’.
In addition to the keynotes, I attended three workshops in the parallel sessions. In the first, Alex Bols, NUS, invited the audience to consider the information needs of postgraduate taught (PGT) students in light of the QAA Quality Code’s push for clear information for students and calls for a PGT NSS. The discussion recognised that PGT students are the forgotten sector of higher education, but that it would be very difficult to provide information sets for them given the fast-moving nature of the programmes they follow. The second workshop, led by Dr Sarah Lewthwaite and Dr Camille Kandiko from King’s College London, focussed on measuring student engagement. The workshop drew on participants’ experiences of the impact of student satisfaction surveys (e.g. the NSS) and the challenges and opportunities of surveys based more on engagement (e.g. the American NSSE). Our own small group questioned whether institutions’ policies of designing internal surveys to mirror the NSS could ever truly enhance learning and teaching; instead, complementary surveys or approaches could influence meaningful change. The final workshop, from Sarah Wilson-Medhurst and Tim Davis, Coventry University, explored the use of collaboratively generated indicators to evaluate learning and teaching innovation (activity-led learning). The attendees were asked to consider whether the approaches used in the evaluation met their aims.
The conference provided a reminder of the importance (and predominance) of satisfaction surveys in contemporary higher education, while also suggesting that student engagement surveys might well provide more meaningful data in the quest for learning and teaching enhancement. The keynote sessions were engaging and interesting. I had chosen to attend three workshops, and while I benefitted from the discussion with colleagues, I felt I would have learnt more from a more straightforward presentation format.
