Joint authors: Terry Wickenham and Joanne Nash
Students disengaging from course content is a major concern for educators. At the University of Sydney Business School (USBS), personalised emails were used to intervene and connect with students who were not engaging with the content of BUSS2000 Leading and Influencing in Business. We thought of it as a digital nudge. Below we investigate the use of early feedback for disengaged students from the tutor’s point of view.
Early feedback is important
There is strong pedagogical support for early feedback: it increases student engagement and belonging by providing a clear link between effort and performance (Yorke, 2001; Tinto, 2012), and it improves students’ subsequent performance (Vaughan et al., 2014).
And now it’s mandated by the government
From 1 January 2024, the Higher Education Support Amendment Bill 2023 requires all Australian universities to create a Support for Students policy, which includes a pre-census task with feedback. This change addresses the need to support students who are at risk of not successfully completing their units of study. While this action is now mandated, some teachers have already been “nudging” students to engage.
So what is nudging?
Nudging theory, developed by Thaler and Sunstein (2008), focuses on influencing behaviour through subtle cues that encourage people to make decisions in their broader self-interest, without monetary incentives.
In education, nudging involves using subtle cues and interventions to guide students toward making positive choices, such as engaging with course materials and participating actively in class (Lawrence et al., 2021; Brown et al., 2022). It has been used to manage student involvement expectations, hold their attention, and motivate them to engage in essential assessment behaviours.
Weijers et al. (2020) emphasise the importance of tailoring interventions to specific contexts and target behaviours and highlight the potential of nudging to influence student behaviour in areas such as academic performance, attendance, and study habits.
The digital nudge
Digital nudging to influence student behaviour has gained attention in higher education. Plak et al. (2022) discuss the use of digital nudges tailored to students’ motivation and perceived ability levels to increase student engagement in online activities.
Low levels of online student engagement impact negatively on student success and adversely affect attrition. Course learning analytics data, combined with nudging initiatives, have emerged as strategies for engaging online students (Lawrence et al., 2021).
A tutor perspective
In this case study, the tutor used learning analytics and class engagement data to identify students, then contacted them through Canvas (the Learning Management System) messaging and through personalised emails sent with the Student Relationship Engagement System (SRES). SRES helps teachers personalise engagement with large student cohorts. Here are two examples of interventions the tutor used:
Problem A: Students not completing pre-work before workshops
The Problem: BUSS2000 is a large core second-year leadership course, and there is significant pressure on academics to manage hundreds, if not thousands, of students. The unit has multiple prework activities students are meant to complete, but these are not graded. Too many students skip them and come to class unprepared.
Intervention: The tutor implemented a digital nudge in the early weeks of the semester, targeting students who had not yet completed their weekly prework activities, including a reflection. Canvas messaging was used to highlight the importance of prework, emphasising the direct correlation between preparation and in-class success. Here is the message sent:
“Hi (student’s first name), I want to help you progress in BUSS2000. Students do well when they come to workshops having already considered the material before we meet. If you need any help preparing for week X, please email me and we can have a chat via Zoom at a time that suits.”
Outcome: This proactive approach motivated students to complete their pre-work, fostering a sense of responsibility and connection between effort and performance. Students commented on the reflection task being especially helpful in allowing them to improve on areas of weakness in their next reflection.
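A nudge like the one above is, in essence, a mail merge over a class list filtered by prework completion. The sketch below shows the idea in Python; the data structure, field names, and the final send step are illustrative assumptions, not the actual SRES or Canvas interfaces.

```python
# Minimal sketch: draft personalised nudges for students who have not
# completed their prework. Field names and the class-list format are
# illustrative; SRES and Canvas provide their own tools for the send step.

TEMPLATE = (
    "Hi {first_name}, I want to help you progress in BUSS2000. "
    "Students do well when they come to workshops having already considered "
    "the material before we meet. If you need any help preparing for week "
    "{week}, please email me and we can have a chat via Zoom at a time that suits."
)

def draft_nudges(students, week):
    """Return (email, message) pairs for students with incomplete prework."""
    return [
        (s["email"], TEMPLATE.format(first_name=s["first_name"], week=week))
        for s in students
        if not s["prework_complete"]
    ]

students = [
    {"first_name": "Alex", "email": "alex@example.edu", "prework_complete": False},
    {"first_name": "Mei", "email": "mei@example.edu", "prework_complete": True},
]

for email, message in draft_nudges(students, week=3):
    print(email, "->", message)
```

The point of the filter is that only disengaged students receive the message, so engaged students are not spammed and the nudge stays targeted.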
Problem B: Identifying students needing support
Problem: While most students successfully completed prework and reflections, a substantial number, particularly from the international student cohort, faced challenges engaging in workshop activities.
Intervention: Using weekly engagement metrics, the tutor identified students requiring additional support before they became at risk of failing. Tailored emails were sent to students who missed classes or participated minimally, providing personalised guidance throughout the semester. Here is an example sent to a student who had participated minimally:
“I appreciate your participation in today’s discussion. The best learning is achieved when we contribute our unique perspectives. To foster this further, I encourage you to seize opportunities in upcoming workshops to share your insightful understanding of the course theories. This is certainly worth trying!”
Outcome: Students expressed increased motivation due to the feedback received and said that it was helpful, demonstrating the tangible benefits of the tutor’s intervention.
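The "weekly engagement metrics" step amounts to scoring each student on observable signals and flagging those below a threshold for a personal follow-up. A minimal sketch, assuming a simple weighted blend of attendance and participation (the metric, weights, threshold, and field names are all illustrative, not SRES's actual model):

```python
# Minimal sketch: flag students whose weekly engagement falls below a
# threshold so the tutor can follow up personally. Weights, threshold,
# and record fields are assumptions for illustration only.

ATTENDANCE_WEIGHT = 0.6
PARTICIPATION_WEIGHT = 0.4
THRESHOLD = 0.5  # below this combined score, flag the student for a check-in

def engagement_score(record):
    """Combine attendance and in-class participation into one 0-1 score."""
    return (ATTENDANCE_WEIGHT * record["attendance_rate"]
            + PARTICIPATION_WEIGHT * record["participation_rate"])

def flag_for_followup(records):
    """Return flagged student IDs, least engaged first."""
    flagged = [r for r in records if engagement_score(r) < THRESHOLD]
    return [r["student_id"] for r in sorted(flagged, key=engagement_score)]

records = [
    {"student_id": "s1", "attendance_rate": 1.0, "participation_rate": 0.8},
    {"student_id": "s2", "attendance_rate": 0.4, "participation_rate": 0.1},
    {"student_id": "s3", "attendance_rate": 0.6, "participation_rate": 0.2},
]

print(flag_for_followup(records))  # → ['s2', 's3']
```

Sorting the flagged list so the least engaged students come first lets a time-poor tutor prioritise who to contact when the cohort is large.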
Does contact by email work?
There is a lot of discussion about learning analytics and how effective or ineffective they are. An over-reliance on data can lead to a narrow focus on easily quantifiable metrics, potentially neglecting important aspects of education including pedagogical knowledge (Thoeming et al., 2023).
We know that emails to students are often not opened. At the University of Sydney:
“The emails sent by program coordinators resulted in an average open rate of just under 30%, and approximately 20% of the students contacted responded either with further information or a request for support.”
(Thoeming et al., 2023)
Not all students open emails. However, with only a limited number of tools that can be used effectively with large cohorts, a response rate of 20% is significant.
So why listen to tutors?
Most discussions around the use of learning analytics and student relationship engagement systems (SRES) are written by academics or learning analytics experts. This case study provides insight into the tutor perspective and how early feedback can make a difference to students in large cohorts. Tutors are at the coal face, delivering tutorials in front of students; they have the most contact with students and are well placed to fine-tune early feedback and nudging prompts to suit the specific needs of their students.
The proof: the student feedback survey score for the tutor was almost perfect. Would you like to give nudging a go, or learn more about early feedback? Consider putting support for students at the centre through early assessment, feedback and personalised communications. Early feedback for disengaged students is essential. We all need a nudge sometimes, and it might just help your students successfully complete their units of study.
About the author
Jo Nash is an Educational Designer with Business CoDesign at the University of Sydney. She brings a practical and strategic approach to educational design, with extensive experience in unit coordination, lecturing, marketing, advertising and strategy. She is particularly interested in the codesign of units to improve teaching outcomes, and in enhancing academic integrity alongside student learning outcomes.
