Lecturer's blog 11: "A journey to encourage active tutorial participation in large cohorts" by Andrea Kuiken
Nowadays, many courses encourage students to come to the tutorials by offering a grade or a bonus score for their active participation. However, in my experience, active participation tends to boil down to the student being rewarded for being present. Tutors need to know all their students' names (or use name tags), and on top of that cannot focus only on the content being delivered and discussed: they have to multitask and also keep track of students' contributions to in-class discussions, which is not always easy. Not to mention the emails from students claiming that they were actively contributing, but that you missed taking note of it.
When I started teaching in the course Introduction to International Business (a first-year bachelor course with 400-500 students), we asked students to answer a set of questions within their team and submit these before the tutorial, and we discussed them during the tutorial. However, only a handful of students were well prepared and actively contributing. We came up with multiple reasons for this:
- the questions were answered in groups, so in some groups of three, each student only knew the answer to some of the questions,
- students feel uncomfortable speaking up in a room of 30 fellow students, and
- we received feedback from students that they did not see the added value of going through the same questions again once they had already answered them.
In this blog, I walk you through the journey I have gone through so far after inheriting the course, addressing mainly the first and the last challenge by experimenting with different tools and set-ups.
As for all courses, Covid forced us to rethink our course. A main question for us as a teaching team was how to ensure that students interact as much as possible with the course material, while active participation is hard to check online, since we could rarely see the students in Blackboard Collaborate. So this is what we did:
- All questions that were previously asked before tutorials were still provided to students, now at the end of short lecture videos. Students were informed that they could answer these questions before coming to the tutorial, and if they had questions or were uncertain about an answer, they could ask either in a discussion forum or during the tutorial.
- Before the tutorials, students had to answer a quiz of 5 multiple-choice questions representative of the course material, and they were shown the right answer straight away, with an explanation of why it was right. These quizzes counted towards the active participation score, following the logic that being able to answer the questions means that you were prepared. Before each tutorial we checked the scores and discussed in class only those questions that were poorly answered.
- We used PollEverywhere and discussion statements with breakout groups during the tutorials to facilitate active participation.
As with every innovation, these changes brought new challenges. Along the way we figured out that, for a group of students, multiple choice became multiple guess, so the answers were not always a fair reflection of preparation. To tackle this, we changed some of the multiple-choice questions in later quizzes into fill-in-the-blank questions, where students have to add a word to a sentence. This gave better insight into whether topics were understood, and given this positive experience we also incorporated fill-in-the-blank questions in the final exam. Moreover, no questions at all were asked about the questions at the end of the lectures, and only a few students used the discussion forum. While the polls and breakout groups worked for one or two tutorials, we soon noticed that students were leaving when we announced breakout groups, that response rates to polls went down over time, and sometimes we even questioned whether students who were logged in were actually there. Students also complained about the quizzes before the tutorials, as they did not see the added value. Despite these new and probably not totally unexpected challenges, we continued to develop the idea the year after.
Last year, when tutorials could take place on campus again, we again made some changes:
- Instead of a quiz before the tutorial, we introduced individual entry tickets. The entry ticket consisted of 3-4 open questions about the course material, or exercises to practise case analysis and reviewing skills, and students had to bring their answers to the tutorial.
- The quiz, with a combination of multiple-choice and fill-in-the-blank questions, was kept, but we did it during the tutorial, and instead of PollEverywhere we provided printed copies of the quiz and gave students 10-15 minutes to answer the questions.
- We kept the questions at the end of the videos and the option to ask questions about these.
The fact that these tutorials were on campus made a big difference to start with. Initially, the idea was that students would swap their answers with a peer and give feedback on each other's work; in reality, this did not work out well. Students preferred to have their own answers in front of them, or did not bring a printed copy, which made swapping difficult. However, the open questions did sometimes result in more discussion in class. Nevertheless, the number of students involved in these discussions remained limited, and as the feedback indicated, the short check by the tutor upon entry was insufficient: students felt that they could write four words in font size 20 and still get their active participation score.
The paper-based quiz worked well. We informed students that it was not there to grade them, but for them to know where they stood and how much they knew. If they wanted, they could check their notes or discuss with their neighbour. At this point it also became clear who had read and taken notes on the course material before class, because several students had notes in front of them to answer the quiz questions. It was so interesting to hear some of the discussions and explanations from one student to another. For example, one student explained the difference between a merger and a joint venture to a fellow classmate by saying that a merger is like two people getting married, while a joint venture is like two people having a baby.
The upcoming year will be the third year that I am coordinating this course, and again I aim to adjust the set-up. In particular, I want to improve the added value of the entry ticket and take a next step in increasing active participation. Towards this end, I will experiment with FeedbackFruits. In particular, this means that:
- Students will submit their entry ticket before the tutorial in FeedbackFruits, where they are assigned a peer.
- During the tutorial, the questions of the entry ticket are discussed. The peer can already see the submission and provide feedback immediately in class, based on the in-class discussion. With this I aim to stimulate active participation, as I expect students to encounter answers that differ from their own and from those provided in class. As they now need to give reasonable feedback, the expectation (or hope) is that more students will be willing to ask the questions that arise from this, and in that way facilitate discussion and active participation.
- The peer review might already reduce the likelihood that students submit an entry ticket with four words in font size 20, or feedback of that kind for that matter, but to further reduce this risk I hope that a student assistant will help out by checking the entry tickets and the feedback.
With baby steps and trial and error, I am slowly getting to the point where active participation, even in a large course with large tutorials, is more than just being present and is easier to grade. I am sure the journey will continue, driven by changes in other aspects of the course (such as the size of the tutorial groups), the experience of tutors and maybe responses from readers of this blog.
Last modified: 23 September 2022, 12.07 p.m.