Teaching Practices for the Student Response System at National Taiwan University

Jennifer, Wen-Shya Lee¹ and Mei-Lun Shih¹,*

¹Center for Teaching and Learning Development, Center for General Education, National Taiwan University, Taiwan

(Received 14 February 2015; Accepted 11 May 2015; Published online 1 September 2015)
*Corresponding author: mshih@ntu.edu.tw
DOI: 10.5875/ausmt.v5i3.862

Abstract: Student response systems (SRSs) have proven useful for enhancing student engagement and improving learning outcomes. Although many previous studies have found that both instructors and students generally hold positive attitudes toward technologies that can increase classroom interaction, continuously maintaining student attention and interest remains a key challenge. The Center for Teaching and Learning Development at National Taiwan University has proactively promoted the use of an SRS since 2011. Two major approaches have been successfully implemented to assist professors in applying the SRS in their classrooms: the adoption of the Zuvio multimedia online interactive system and the establishment of an SRS faculty development group. Based on qualitative data gathered from this group, this study elucidates four crucial SRS teaching practices: (1) designing pre- and post-instruction assessments of content comprehension, (2) fostering participatory learning through guided classroom discussions, (3) combining theory and practice, and (4) involving students in group report evaluation.

Keywords: Zuvio, student response system (SRS), teaching practice

Introduction

In the past decade, research on teaching and learning in higher education has revealed great interest in the use of interactive technologies to enhance classroom engagement for “digital-native” students. Student engagement is affected by many variables but is seen as a key predictor of successful learning outcomes [1], [2]. Student response systems (SRSs) have been widely adopted in classrooms because they are positively associated with increased student engagement [3], [4]. An SRS is commonly known as a personal or audience response system, electronic voting system, or clicker. Instructors can use an SRS to engage students in class through a cycle of interactive activities: posing questions in various formats (e.g., multiple choice, true/false, or composite), immediately collecting student responses, and displaying the results on projection screens within the classroom, thus facilitating in-class peer discussion or feedback [5].
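
To make this interaction cycle concrete, the following minimal Python sketch (illustrative only; the class and method names are our own and do not correspond to any particular SRS product) models one round of posing a multiple-choice question, collecting responses, and tallying the results for on-screen display.

    from collections import Counter

    class PollRound:
        """One cycle of an SRS interaction: pose a question, collect answers, tally."""

        def __init__(self, prompt, options):
            self.prompt = prompt        # question shown to the class
            self.options = options      # e.g., ["A", "B", "C", "D"]
            self.responses = {}         # student_id -> chosen option

        def collect(self, student_id, choice):
            # Each student may revise an answer; only the latest response is kept.
            if choice not in self.options:
                raise ValueError(f"{choice!r} is not a valid option")
            self.responses[student_id] = choice

        def tally(self):
            # Aggregate counts per option for display on the projection screen.
            counts = Counter(self.responses.values())
            return {opt: counts.get(opt, 0) for opt in self.options}

    # Example: a quick in-class concept check
    poll = PollRound("Which statement best describes Newton's third law?", ["A", "B", "C", "D"])
    poll.collect("s001", "B")
    poll.collect("s002", "A")
    poll.collect("s002", "B")   # the student revises an answer
    print(poll.tally())         # {'A': 0, 'B': 2, 'C': 0, 'D': 0}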

The relevant literature describes SRS implementations in physics, health science, medicine, nursing, science, and chemistry [1], [6], [7], [8], and these and other studies provide a comprehensive overview of the potential benefits and possible challenges of using an SRS. One frequently reported advantage is the ability of an SRS to stimulate interaction between students and the instructor, thus increasing student willingness to engage in the class [1], [7]. Compared with other interactive methods generally used in large classrooms (e.g., hand-raising and flashcards), most students prefer SRSs because they effectively reduce the time spent collecting individual responses and offer the possibility of anonymity [9], [10], [11]. In addition, by posing various questions to students in the classroom, SRSs give instructors a clearer understanding of student learning issues, including misunderstandings and inability to keep pace, allowing them to reexamine and adjust course content and pacing as necessary [3], [10], [12]. Although many studies have found that both instructors and students generally hold positive attitudes toward SRSs [12], [13], [14], several have suggested that some of the benefits could merely result from the novelty value of the new technology or a change in classroom pedagogy [5], [6], [10].

According to Wieman and Perkins [15], an SRS can deeply and positively impact student learning behavior only if the classroom dynamic, question design, and follow-up activities are guided by instructors who understand how students learn. Thus, instructors seeking to maximize the benefits of SRS use must also have strong pedagogical and technological knowledge and skills [6], [14].

This paper describes our successful experience in assisting instructors from different disciplines in using the National Taiwan University (NTU) SRS. Suggestions for SRS implementation and teaching strategies are drawn from qualitative data gathered over the past three years to provide instructors with advice and guidance for the future use of classroom-based SRSs.

SRS Implementation at NTU

Since 2011, NTU has proactively promoted the use of an SRS to increase student engagement and maintain student attention in classrooms. To assist professors in effectively integrating the SRS into their instruction, two major approaches were applied at NTU: (1) the adoption of the Zuvio multimedia online interactive system (Zuvio), and (2) the operation of an SRS faculty development group for instructors interested in using the SRS.

Adoption of Zuvio

Since 2011, two types of SRSs have been used at NTU. The first is the instant response system (IRS), an early example of a clicker that allows students to interact with instructors through a small hand-held transmitter about the size of a television remote control. The second type of SRS is Zuvio, an online interactive system developed by NTU electrical engineering alumni in 2012.

As shown in Table I, Zuvio is easier to implement and provides a wider range of question formats. Zuvio is a cloud-based system, meaning students can interact with the instructor and their peers through any Internet-enabled hand-held device, such as a smartphone or tablet computer. This reduces the effort of managing and maintaining response devices before and after class. Moreover, with multiple question formats (including open-ended questions, composite question sets, and peer evaluation questions), instructors can use Zuvio to design in-class learning activities that require conscientious responses from students. Students can not only choose from preset answer pools but also post free-text responses that reflect their personal thinking. For example, open-ended and composite questions can guide students to reflect on their own learning. In addition, Zuvio’s group Q&A display functions can help students engage in in-depth peer discussions and debates. This in turn helps maintain student attention and interest throughout the semester, even after the initial excitement about using a new technology fades.
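
To illustrate how these formats differ in the kind of responses they collect, the simplified sketch below (our own data model for exposition, not Zuvio’s actual interface or API) represents multiple-choice, open-ended, and composite questions as lightweight Python data classes.

    from dataclasses import dataclass, field
    from typing import List, Union

    @dataclass
    class MultipleChoice:
        prompt: str
        options: List[str]      # students pick from a preset answer pool

    @dataclass
    class OpenEnded:
        prompt: str             # students post free-text responses

    @dataclass
    class Composite:
        prompt: str             # several sub-questions answered as one item
        parts: List[Union[MultipleChoice, OpenEnded]] = field(default_factory=list)

    # Example: a composite item combining a choice with a short justification
    item = Composite(
        prompt="Evaluate the experimental design described on the slide.",
        parts=[
            MultipleChoice("Is the control group adequate?", ["Yes", "No"]),
            OpenEnded("Briefly justify your answer."),
        ],
    )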

Over the course of 2014, Zuvio usage on the NTU campus increased from 61 to 263 instructors, 68 to 384 courses, and 2,037 to 11,172 students. Although the majority (54%) of questions deployed in Zuvio were multiple choice, many instructors also used open-ended questions (20%) and composite questions (21%) to promote deeper engagement and reflection.

SRS faculty development group

To help promote and guide NTU instructor interest in the SRS, in 2012 the Center for Teaching and Learning Development formed the NTU SRS Faculty Development Group, which has attracted participation from approximately 40 professors from the colleges of Liberal Arts, Engineering, Medicine, Science, and Social Sciences. In helping instructors adopt and implement the SRS in the classroom, this faculty development group serves as both a peer-support and a peer-learning group. Previous studies [14] have suggested that providing instructors with appropriate pedagogical and technological training and instruction promotes successful SRS adoption.

During each semester, the group meets monthly to discuss and share teaching experiences and strategies for using SRSs in various disciplines. Initial operational training and on-site technical support are provided when instructors begin using the SRS. To gain better insight into the challenges that instructors face in using Zuvio in their classrooms, system developers were invited to attend group meetings at the end of both semesters, providing instructors with opportunities to raise suggestions and recommendations for future system improvements.

Because most SRS functions are easy to learn, group meetings typically focus on sharing and discussing teaching strategies related to SRS usage. Videos (10–20 min) of experienced instructors (from NTU and other universities) using SRSs in their classes are used as peer demonstration materials and to introduce issues for group discussion. Instructors can then identify different SRS applications in various contexts and reflect on how to incorporate the system in their own teaching. The provision of this type of training has been found to have a significant impact on SRS usage effectiveness [14].

Effective SRS Teaching Practices

Teaching development for instructors using SRS assessment and feedback technologies positively affects student perceptions of SRSs and improves student engagement [14], [16], [17]. This finding is consistent with practical observations emerging from SRS-related discussions in the NTU SRS Faculty Development Group over the past three years.

To understand how instructors use the SRS to effectively increase student participation in classrooms, we collected data from the SRS Faculty Development Group including:

  1. Twelve PowerPoint files created by experienced professors who shared SRS practices they used to successfully increase student learning attention and participation.
  2. Eight 10–20 minute videos showing how instructors interacted with students through the SRS tools.
  3. Minutes and discussion notes for twenty group meetings related to the strategic use of SRS tools in teaching.

Qualitative thematic analysis [18] elucidated several teaching practices that contributed to the effective use of SRS applications in classrooms, suggesting that teaching strategies must be effectively integrated into the adoption of SRSs to enhance student engagement and improve learning performance.

Practice 1: Pre- and post-assessment for content comprehension

Instructors face challenges in ensuring that students have the requisite prior knowledge at the beginning of a course and that students fully comprehend all the presented information. Many studies have shown that regularly using SRS tools in teaching can provide instant and effective assessment feedback on student learning [6]. However, formative feedback is more effective than summative feedback in promoting student learning and positive attitudes toward the use of an SRS in classrooms [1], [8], [10], [14].

The first strategic teaching approach is to design ungraded pre- and post-test SRS questions. At the beginning of class, the SRS can be used to pose several conceptual questions, allowing instructors to gauge students’ prior knowledge of relevant course content and their level of comprehension. Based on the pre-test results, instructors can then adjust the teaching content accordingly. The pre-test also indicates whether students have fully absorbed the main points from previous class sessions and whether additional clarification or review is required before embarking on new concepts. Moreover, these questions prompt students to reflect on their understanding of the course material, leading them to focus more on aspects that they do not completely comprehend. At an appropriate point, instructors can conduct a post-instruction assessment using the SRS. A comparison of the pre- and post-assessment results provides an indication of learning progress and outcomes.
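
The paper does not prescribe a particular metric for this comparison; as one common and simple choice, the hypothetical sketch below computes the class-average normalized gain from matched pre- and post-test scores collected through the SRS.

    def normalized_gain(pre_scores, post_scores):
        """Class-average normalized gain: (post - pre) / (100 - pre), scores in percent."""
        pre = sum(pre_scores) / len(pre_scores)
        post = sum(post_scores) / len(post_scores)
        if pre >= 100:
            return 0.0   # no room for improvement
        return (post - pre) / (100 - pre)

    # Example: percent-correct scores on the same SRS concept check, before and after instruction
    pre = [40, 55, 60, 35, 50]
    post = [70, 80, 85, 60, 75]
    print(f"normalized gain = {normalized_gain(pre, post):.2f}")   # normalized gain = 0.50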

Practice 2: Participatory learning through guided classroom discussions

In most classrooms, instructors try to maintain a balance between lectures and activities to provide students with opportunities for hands-on practice or direct involvement with the subject material. In addition, Taiwanese university students are generally reluctant to express opinions or ask questions publicly. Thus, SRS technologies and their corresponding instruction techniques could be used to build an engaging classroom environment that encourages interaction.

The SRS tallies overall student responses; when the aggregate result diverges from the instructor’s expectations, it can indicate conceptual confusion among students. In addition to using the SRS randomized selector to choose particular students to explain their views, instructors can encourage students to explicate course content through small-group discussions of key questions. The integration of an SRS can help students clarify crucial concepts through cognitive processes including conjecture, speculation, and refutation. After the discussion, the same questions can be posed again to the whole class before the instructor draws conclusions. This method of guiding classroom discussions is particularly well-suited to questions without standard answers or to debatable open-ended issues. For instance, the instructor could use the SRS’s anonymity function to obtain a general sense of student opinions on controversial issues. Statistics from the class as a whole could be displayed, and a few representative responses could be chosen for further group or class discussion. In this way, students can express their opinions fully without feeling pressure to conform or fear of ridicule. In addition, students can compare their own opinions to the general sentiment and reflect on the reasons for any divergence.
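
A minimal sketch of this ask-discuss-ask-again sequence (illustrative only; it does not reflect how any specific SRS stores or displays its data) might compare the class-wide answer distribution before and after the small-group discussion:

    from collections import Counter

    def distribution(responses):
        """Percentage of the class choosing each option."""
        counts = Counter(responses)
        total = len(responses)
        return {opt: round(100 * n / total) for opt, n in counts.items()}

    # Responses to the same conceptual question, before and after small-group discussion
    first_vote  = ["A", "B", "B", "C", "A", "B", "D", "B", "A", "C"]
    second_vote = ["B", "B", "B", "B", "A", "B", "B", "B", "A", "B"]

    print("before discussion:", distribution(first_vote))    # {'A': 30, 'B': 40, 'C': 20, 'D': 10}
    print("after discussion: ", distribution(second_vote))   # {'B': 80, 'A': 20}
    # A shift toward one option after discussion suggests that the peer exchange
    # helped students converge on, and ideally clarify, the key concept.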

Practice 3: Combining theory and practice

Textbooks present theories and concepts that are conclusions drawn from copious scholarly exploration and research. If instructors begin their courses by directly discussing these final processes and results, students may have trouble generating the level of interest required for deeper engagement. However, using the SRS polling functions, instructors can ask students to predict the outcomes of certain studies and then compare those predictions with the actual findings for similar research questions. This strengthens students’ understanding of theoretical concepts. For instance, to observe how questionnaire design influences respondent attitudes, the instructor could use the SRS to survey student satisfaction with the job performance of Taiwan’s president. By responding to related questions and monitoring satisfaction statistics from the class as a whole, students can see that the results of the in-class experiment correspond to the theoretical claims outlined in their textbooks.

Practice 4: Student participation in group report evaluation

In designing courses, many instructors use group presentations to help students develop teamwork and oral expression skills. However, evaluation is often conducted by the instructor alone; students may feel a lack of involvement and thus “tune out” their peers’ presentations. Feedback from the NTU SRS Faculty Development Group suggested that this issue can be easily resolved through SRS use. In this scenario, instructors use Zuvio to let students grade peer presentations and display the grades on-screen in real time, providing each presenter with grades and written feedback on the quality and clarity of his or her presentation. Many studies have noted that students like to know how well they are doing relative to their peers [7]. Because constructive feedback can be offered instantly during group project presentations, this peer evaluation feature is very popular among NTU instructors using Zuvio, particularly in the College of Liberal Arts and the College of Medicine.
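
The sketch below (a simplified illustration of this workflow, not Zuvio’s actual peer-evaluation feature) aggregates the grades and written comments submitted by classmates so that a summary can be displayed for each presenting group immediately after its presentation.

    from statistics import mean

    def summarize_peer_evaluation(evaluations):
        """evaluations: list of (score, comment) pairs submitted by classmates."""
        scores = [score for score, _ in evaluations]
        comments = [comment for _, comment in evaluations if comment.strip()]
        return {
            "average_score": round(mean(scores), 1),
            "n_raters": len(scores),
            "comments": comments,
        }

    # Example: peer feedback collected for one group's presentation
    feedback = [
        (85, "Clear structure, but the conclusion felt rushed."),
        (90, "Good use of examples."),
        (78, ""),
    ]
    print(summarize_peer_evaluation(feedback))
    # {'average_score': 84.3, 'n_raters': 3, 'comments': ['Clear structure, ...', 'Good use of examples.']}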

Conclusion

SRSs have attracted considerable attention from instructors in various disciplines. Although many studies have found that most instructors and students hold a positive attitude toward SRS use in their classrooms, optimizing the impact of SRS use on learning outcomes requires careful consideration of system implementation and teaching strategies [6].

This paper reports our experiences at NTU in assisting instructors across different fields, and with different technology-usage backgrounds, to effectively integrate an SRS into their instruction. In the implementation stage, we suggest two approaches to enhancing SRS usage in classrooms. First, a cloud-based SRS such as Zuvio can provide instructors with more flexibility in instructional design; its user-friendly features reduce technical requirements and improve satisfaction with the system among both instructors and students. Second, we propose organizing an SRS faculty development group to assist and motivate instructors in using an SRS. Such a group is particularly helpful in providing both technical and pedagogical support to instructors as they adopt a new technology, such as an SRS, in their classrooms.

Based on the successful experiences of NTU instructors with the SRS, we identified four crucial SRS teaching practices. First, conducting pre- and post-assessments with an SRS can provide both instructors and students with the opportunity to respectively reflect on their teaching and learning. Second, the SRS can be used to guide student classroom discussions to achieve participatory learning. Third, SRS activities should be used as a bridge to combine theory and practice, thus improving student understanding of theoretical concepts. Finally, students should be allowed to contribute to group report evaluations through the SRS, thus effectively increasing students’ feelings of engagement with the course assessment.

In conclusion, an SRS is more than a simple attendance-taking or examination/assessment tool. The main objective of SRS use in higher education is to enliven the interactive atmosphere between students and instructors, to promote proactive learning, and to maintain student concentration. An SRS can immediately increase student response rates; however, fully exploiting an SRS to achieve improved learning outcomes requires appropriate pedagogical and technical training for instructors. In the initial phases, instructors may need to devote more time and energy to familiarizing themselves with system operation or to restructuring course materials. After a period of use, however, they should be able to develop increasingly customized strategies and methods similar to those presented in this paper. This in turn will help students focus more on course content and eventually produce positive educational results.

References

  1. J. E. Caldwell, "Clickers in the large classroom: current research and best-practice tips", CBE-Life Sciences Education, vol. 6, no. 1, pp. 9-20, 2007.
    doi: 10.1187/cbe.06-12-0205
  2. K. M. Christopherson, "Hardware or wetware: what are the possible interactions of pedagogy and technology in the classroom?" Teaching of Psychology, vol. 38, pp. 288-292, 2011.
    doi: 10.1177/0098628311421332
  3. R. E. Landrum, "The ubiquitous clicker: SoTL applications for scientist-educators", Teaching of Psychology, vol. 40, no. 2, pp. 98-103, 2013.
    doi: 10.1177/0098628312475028
  4. J. R. Stowell and J. M. Nelson, "Benefits of electronic audience response systems on student participation, learning, and emotion", Teaching of Psychology, vol. 34, pp. 253-258, 2007.
    doi: 10.1080/00986280701700391
  5. K. J. Denker, "Student response systems and facilitating the large lecture basic communication course: assessing engagement and learning", Communication Teacher, vol. 27, no. 1, pp. 50-69, 2013.
    doi: 10.1080/17404622.2012.730622
  6. J. H. Han, "Closing the missing links and opening the relationships among the factors: a literature review on the use of clicker technology using the 3P model", Educational Technology & Society, vol. 17, no. 4, pp. 150-168, 2014.
  7. R. H. Kay and A. LeSage, "Examining the benefits and challenges of using audience response systems: a review of the literature", Computers & Education, vol. 53, no. 3, pp. 819-827, 2009.
    doi: 10.1016/j.compedu.2009.05.001
  8. J. MacArthur and L. L. Jones, "A review of literature reports of clickers applicable to college chemistry classrooms", Chemistry Education Research and Practice, vol. 9, pp. 187-195, 2008.
    doi: 10.1039/B812407H
  9. M. Fallon and S. L. Forrest, "High-tech versus low-tech instructional strategies: a comparison of clickers and handheld response cards", Teaching of Psychology, vol. 38, pp. 194-198, 2011.
    doi: 10.1177/0098628311411896
  10. V. Simpson and M. Oliver, "Electronic voting systems for lectures then and now: a comparison of research and practice", Australasian Journal of Educational Technology, vol. 23, no. 2, pp. 187-208, 2007.
  11. C. Wieman and K. Perkins, "Transforming physics education", Physics Today, vol. 58, no. 11, pp. 36-41, 2005.
    doi: 10.1063/1.2155756
  12. R. M. Carini, G. D. Kuh, and S. P. Klein, "Student engagement and student learning: testing the linkages", Research in Higher Education, vol. 47, no. 1, pp. 1-32, 2006.
    doi: 10.1007/s11162-005-8150-9
  13. M. Barber and D. Njus, "Clicker evolution: seeking intelligent design", CBE-Life Sciences Education, vol. 6, no. 1, pp. 1-8, 2007.
    doi: 10.1187/cbe.06-12-0206
  14. J. H. Han and A. Finkelstein, "Understanding the effects of professors’ pedagogical development with clicker assessment and feedback technologies and the impact on students’ engagement and learning in higher education", Computers & Education, vol. 65, pp. 64-76, 2013.
    doi: 10.1016/j.compedu.2013.02.002
  15. C. Wieman, "Why not try a scientific approach to science education?" in Taking stock: research on teaching and learning in higher education, J. C. Hughes and J. Mighty, Eds. Montreal, QC, Canada: McGill-Queen’s University Press, 2010, pp. 175-190.
  16. D. Bruff, "Teaching with classroom response systems: creating active learning environments", San Francisco, CA: Jossey-Bass, 2009.
  17. J. D. Elicker and N. L. McConnell, "Interactive learning in the classroom: is student response method related to performance?" Teaching of Psychology, vol. 38, pp. 147-150, 2011.
    doi: 10.1177/0098628311411789
  18. V. Braun and V. Clarke, "Using thematic analysis in psychology", Qualitative Research in Psychology, vol. 3, no. 2, pp. 77-101, 2006.
    doi: 10.1191/1478088706qp063oa
