The summer Pre-sessional period is *drawing to a close* with institutions *heaving a huge sigh of relief* that they have managed more or less successfully to deliver their programmes online *in the wake of* the Covid-19 lockdown. Having written that last sentence, I realise it is full of dead metaphors (highlighted in italics) that writers often fall back on to connect with their audience. I’m not writing creative fiction here, so these well-worn structures are useful to get us *on the same page*. (Now I can’t stop myself!) It got me thinking, though, about whether we aren’t guilty of doing the same thing when we craft feedback surveys for students and teachers at the end of Pre-sessional English (PSE) programmes. Do we fall back on the same questions without stopping to critically evaluate how appropriate or useful the responses will be?
I’ve written quite a lot of these surveys over the years for PSE programmes but also for teacher training events and for research purposes (Alexander, 2012; Alexander et al., 2017). It’s not easy to craft questions to get the answers you are looking for or to get real insights into the concept you’re exploring. Novice researchers, such as the students I’ve taught on Research Preparation courses, often expect to start doing research by writing a questionnaire, possibly thinking of market research surveys they’ve been asked to complete. I’ve done this myself, committing the two cardinal sins of questionnaire design: I couldn’t provide a theoretical justification for the specific form of the questions I was asking and I hadn’t thought about how I would analyse the answers.

I’ve since learned to be more strategic when designing feedback surveys for PSE programmes. In line with genre theory, I keep an eye on the audience and purpose for the feedback results, so I ask questions that will give me answers I can use as evidence to drive changes or, alternatively, to resist them. For example, several years running, temporary summer staff at my institution had a lot of difficulty getting paid correctly and on time, so I included a question in the staff feedback survey about university services such as HR and payroll. I could then use the predictably negative responses to call for changes to HR and payroll systems. For a number of years now, PSE programmes have used a coursebook I wrote. There is a knee-jerk reaction amongst some teachers against coursebooks: they want the freedom to use their own favourite lessons, irrespective of whether their materials would deliver appropriate learning outcomes or support the assessments. To counter this, I included a question in the student feedback survey about how well the coursebook had supported their learning. The results were almost always positive and could be shown to teachers returning in subsequent years to justify continued use of the coursebook.
This year, with the migration of PSE programmes online, the biggest challenge was convincing teachers that the reduced number of face-to-face teaching hours was appropriate for the new online delivery. The students were told before the start of the programmes that they could expect to spend three hours preparing for each synchronous session, using PowerPoint slides with a teacher voiceover. They were also told the length of the synchronous sessions and that there would be asynchronous interaction with their teacher through discussion boards, email and personal blogs. Several questions in the student feedback survey thus focused on the effectiveness, or otherwise, of this different type of teaching. These questions were asked in the following order, using quantitative (multiple-choice or agree/disagree) responses. The first two questions prime the response to the third, and we might have got a different response to question 3 if it had been asked first.
- How many hours did you spend preparing for the Collaborate sessions?
- Would you prefer the preparation slides with voice-over or without voice-over?
- Doing the exercises in the preparation slides increased my participation in the Collaborate (synchronous) sessions.
Of the 252 students who completed the feedback survey, 80% claimed to spend at least two hours preparing; 68% preferred the Preparation PowerPoint to have a voiceover (some teachers disliked this aspect of the preparation); and 91% agreed that doing the preparation helped them to participate in the synchronous sessions. This data constitutes powerful evidence to resist any attempt to make significant changes to the structure of the programmes.
These kinds of quantitative questions are easy to analyse and, in fact, the survey tool will crunch the numbers for you. The more difficult questions to analyse are the open questions, which ask for a written response, e.g. What was the best thing about the PSE Online programme? This requires the survey designer to work through all the responses, categorising them and counting the frequency of responses in each category (I sketch how that tallying step might be scripted after the student comments below). It does give rise to deeper insights, however. So, in answer to the question What is the best thing you will remember about the PSE Online programme? most students responded “my teacher”. However, one or two were able to provide deeper insights:
Learning in advance, reviewing in time and learning to summarize is the most effective self-learning for anything.
What impressed me most about the online language class was to preview the homework in advance, because it was very different from the Chinese way of teaching. I think I learned a lot of learning methods and I hope to make progress.
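For anyone facing hundreds of open responses, the counting part of the categorise-and-count step can be partly automated once the categories have been decided. The Python sketch below is purely illustrative: the keyword-based coding scheme is a hypothetical stand-in, not the categories I actually used, and in practice coding responses is an interpretive, manual task; a script like this only handles the tallying.

```python
from collections import Counter

# Hypothetical coding scheme: category -> keywords that signal it.
# In practice, categories emerge from reading the responses themselves.
CATEGORIES = {
    "teacher": ["teacher", "tutor"],
    "preparation/self-study": ["prepar", "preview", "advance", "self-learning"],
    "learning methods": ["method", "summar", "review"],
}

def code_response(response: str) -> list[str]:
    """Return every category whose keywords appear in the response."""
    text = response.lower()
    matched = [cat for cat, keywords in CATEGORIES.items()
               if any(k in text for k in keywords)]
    return matched or ["uncoded"]  # flag responses needing manual review

def tally(responses: list[str]) -> Counter:
    """Count how many responses fall into each category."""
    counts = Counter()
    for response in responses:
        counts.update(code_response(response))
    return counts

# Example with responses like those quoted above
sample = [
    "my teacher",
    "Learning in advance, reviewing in time and learning to summarize",
    "to preview the homework in advance",
]
for category, n in tally(sample).most_common():
    print(f"{category}: {n}")
```

Note that a response can fall into more than one category, and anything the keywords miss is flagged as “uncoded” so it can be read and categorised by hand rather than silently dropped.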
Alexander, O. (2012). Exploring teacher beliefs in teaching EAP at low proficiency levels. Journal of English for Academic Purposes, 11(2), 99–111.
Alexander, O., Sloan, D., Hughes, K. & Ashby, S. A. (2017). Engaging with quality via the CEM model: Enhancing the content and performance management of postgraduate in-sessional academic skills provision. Journal of English for Academic Purposes, 27(1), 56–70.