
Good experiences with an audience response system used in medical education

Jacob Vad Jensen1, Doris Østergaard2 & Anne-Kathrine Hove Faxholt3

1 Nov 2011


Audience response systems (ARS), also known as "clickers", are used to heighten participants’ active involvement in educational activities such as lectures [1-7]. Each participant is provided with a personal hand-held voting unit and can answer questions with a wireless "click". Questions are presented as part of a PowerPoint presentation and data from the audience are collected by a central unit [3, 4]. The hypothesis is that ARS creates an interactive learning environment that heightens attention and thereby improves the learning opportunity [1, 2, 4-14]. A recent overview [4] indicates that ARS can increase participants’ activity and attention. The voting system can generate results immediately after the answer is given. Hence, both participants and teachers receive useful feedback that may be used during as well as after the session. ARS has a positive effect on short-term memory, while any effect on long-term memory is insufficiently documented [4, 12, 13, 15-18].

ARS is often used internationally, but, to the best of our knowledge, it has not yet been implemented in Danish training programmes. Knowledge of the technical and pedagogical challenges is limited [17-20]. The first purpose of this paper was to evaluate the ARS voting system as a tool 1) for course evaluation, 2) for testing learners’ knowledge, 3) for increasing activity and attention and 4) for stimulating discussion. The second purpose was to evaluate the technical and pedagogical challenges and to provide practice recommendations for faculty.

MATERIAL AND METHODS

In an ARS session, a computer connected to a projector and a signal receiver is used. All participants are supplied with a voting unit, which costs about 40 euros. The ARS programme collects votes during the lecture and the software automatically saves all data. Voting questions are designed in PowerPoint using the Turning Point software. The voting function may be adjusted according to the aim of the questions. Voting answers may be displayed immediately as a graph or kept hidden from the participants.
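The underlying data flow is simple: each unit submits one answer, the central unit tallies the answers, and the result is either shown as a graph or kept hidden. The following Python sketch illustrates that flow only; it is not the Turning Point software, and the unit identifiers and function names are invented for illustration.

```python
# Minimal sketch of one ARS voting round: collect one vote per unit,
# tally the answers, and either display or withhold the result.
# Illustrative only; the real Turning Point software handles this internally.
from collections import Counter

def run_question(prompt, options, votes, show_result=True):
    """Tally votes (unit id -> chosen option); optionally print a bar chart."""
    tally = Counter(votes.values())
    if show_result:
        print(prompt)
        for opt in options:
            n = tally.get(opt, 0)
            print(f"  {opt:<10} {'#' * n} ({n})")
    return tally

# Simulated votes from four hand-held units (hypothetical unit ids).
votes = {"unit01": "Doctor", "unit02": "Nurse",
         "unit03": "Nurse", "unit04": "Doctor"}
run_question("What is your profession?", ["Doctor", "Nurse"], votes)
```

Keeping the result hidden, as the text above describes, simply corresponds to calling the same tallying step with `show_result=False` and revealing the graph later.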

The Danish Institute for Medical Simulation conducts several courses for different types of learners. Four different courses were selected to evaluate the voting system’s possibilities. ARS was presented to course directors and teachers who agreed to use the ARS for one of their specified purposes. All questions were developed, pilot-tested for understanding and adjusted as needed by the course directors and the teachers in collaboration with the research group. Only a few of the teachers had previous experience with ARS. All teachers were introduced to its use via written instructions and/or practical training. A brief manual and a checklist for the setup were developed.

The first author participated in the first courses and was able to assist and provide support. Notes were taken during these sessions describing the difficulties observed. Feedback from teachers was obtained through questionnaires or semi-structured interviews conducted by the first author. The interviews were taped and transcribed.

1. Audience response system used for course evaluation

ARS was introduced as a tool for course evaluation at courses in resuscitation for medical students. The final presentation contained 16 questions that were related to the contents of the course and four questions about the use of ARS. All participants had the possibility of providing written comments. A questionnaire was developed to obtain feedback from teachers.

2. Audience response system used to evaluate knowledge and to make a summary

ARS was used at the end of sessions on courses directed at medical emergency teams. A knowledge test and summary presentation consisting of nine multiple-choice questions was developed. The first question was about the respondent’s profession (doctor or nurse) and was included to introduce the participants to ARS. The initial question also allowed the answers to the remaining questions to be related to the respondents’ professions. When all votes were collected, the correct answer was indicated with a "smiley", and the participants thereby received immediate test feedback. Interviews were conducted with teachers after the courses.
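Because the first question recorded each respondent’s profession, the later answers could be broken down by profession. The sketch below shows one way to do that cross-tabulation; the record layout is hypothetical and does not describe the actual export format of the ARS software.

```python
# Illustrative cross-tabulation of later answers by the profession given in
# question 1. (unit_id, question_no, answer) triples are a hypothetical layout.
from collections import defaultdict

responses = [
    ("unit01", 1, "Doctor"), ("unit01", 2, "A"),
    ("unit02", 1, "Nurse"),  ("unit02", 2, "B"),
    ("unit03", 1, "Nurse"),  ("unit03", 2, "A"),
]

# Map each unit to the profession reported in question 1.
profession = {unit: ans for unit, q, ans in responses if q == 1}

# counts[question][profession][answer] -> number of votes
counts = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
for unit, q, ans in responses:
    if q != 1 and unit in profession:
        counts[q][profession[unit]][ans] += 1

for q in sorted(counts):
    for prof, answers in sorted(counts[q].items()):
        print(f"Q{q} [{prof}]: {dict(answers)}")
```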

3. Audience response system as a discussion incentive

ARS was used to initiate discussions at a communication course in the education programme for anaesthesiologists. Cases describing the critically ill intensive care patient, situations involving end-of-life decisions, and the preparation of and discussions with relatives were included. In this context, ARS was intended to stimulate good discussions. A total of 13 ARS questions were developed to illustrate the attitudes of course participants, and these were presented at the beginning and at the end of the case session. After the course, the teacher provided written comments about ARS. Subsequently, an interview was conducted.

4. Audience response system as a tool to raise activity and attention

ARS was evaluated on an international one-day course in patient safety that comprised lectures given to anaesthesiologists. Initially, questions referring to the participants’ knowledge were asked. During the lectures, questions addressed participants’ attitudes towards patient safety and patient safety culture. The answers were immediately presented to the audience and commented on by the teacher, who could adjust subsequent presentations accordingly. At the end of the session, course participants rated the usefulness of the system. An interview was conducted with the Danish teacher.

Danish law exempts this type of research from ethical board approval. The voting results had no influence on or negative consequences for any participant.

Data analysis

Our data included voting results from all questions asked during sessions as well as evaluation questions regarding the use of ARS. The teachers’ experience with ARS was collected using either questionnaires or semi-structured interviews. The notes describing the difficulties observed in the first courses were also included. Based on these sources of information, data were divided into comments about personal, technical and pedagogical challenges.

RESULTS

ARS was used on 33 courses by 215 participants and evaluated by 12 teachers. Table 1 shows the number of courses, course participants and teachers, the number of questions and how data were collected. Table 2 shows the results of the participants’ evaluation of the system. A summary of the personal, technical and pedagogical challenges and positive aspects is provided in Table 3.

We planned to use ARS for evaluation in 20 resuscitation courses. Due to technical problems during the session (n = 4) or in the saving procedure (n = 2), data from six courses were unavailable. The participants found that the "clickers" were easy to use and that questions were understandable. More than 90% found ARS suitable for course evaluation. Only 11% would have liked to give further written comments. All the teachers found that ARS was a useful tool for evaluation and viewed the possibility of future use of ARS positively. Some teachers found that the setup of ARS was technically challenging and would have preferred better written instructions. More than half of the teachers found it useful to obtain the results immediately and thereby receive feedback on their teaching.

ARS was used for knowledge testing and to summarize at the end of six emergency team courses, but data were saved incorrectly in one of these. Approximately 95% of the participants found that the questions were easy to understand. All appreciated that the right answer was indicated, and they found that ARS was suitable for a short knowledge test at the end of the course. One interview was conducted with the two teachers, who both found the use of ARS for post-testing interesting. However, they indicated that some of the questions should be improved.

The ARS system was used as a tool to initiate discussion, and 67% of the participants found the questions to be a good starting point for discussion about communication and ethics. In small groups, residents ascribed less importance to anonymous voting than to non-anonymous voting. Overall, 86% of the participants agreed that ARS was a suitable tool in discussions. The teachers found that the questions should have been more elaborate and that it was difficult to start using ARS for discussions.

At the course where the ARS system was used as an instrument to increase activity and attention, more than 90% of the participants found that the system increased the level of concentration and interactivity. Around 65% indicated that it was important to be able to answer the questions anonymously. The teachers found ARS very useful in engaging participants at the beginning of the course. The voting system was helpful in finding out about the participants’ attitude towards the topic. The teachers, however, found it challenging to comment on answers immediately after their on-screen presentation. They also found it difficult to adjust their teaching content accordingly.

DISCUSSION

Overall, our experiences with ARS as a tool for evaluating a course, for testing knowledge or increasing attention were positive. Participants found the system very useful and the teachers described the system as stimulating, but also challenging.

Our overall experience is that ARS is robust, especially when used for evaluation. Most of our data are related to this function. The experiences with ARS as a tool for evaluating knowledge and as an incentive to stimulate discussion and increase attention are based on a small sample as the number of teachers and participants in our study was limited. Hence, we may have overlooked some important challenges or positive experiences.

The results indicate that ARS is a suitable tool for electronic course evaluation. Overall, it is stable and easy to use, and administration costs are low compared with written evaluations. Some of the teachers, however, experienced technical problems and data were lost. The software version we used had a data-saving procedure with two almost identical icons, and users could therefore have misunderstood which icon actually saved their data correctly. Our results indicate a need for more practical training in the use of ARS.

Teachers can receive useful feedback immediately after the session, which makes it possible to discuss any need for changes. We speculated whether the lack of an opportunity to provide written comments was important to participants. The participants were therefore given the possibility of writing comments on paper; only a few used this opportunity. This may be due to the fact that this course had been running for more than a year and hence had improved over time. There might be other situations in which the ability to provide written comments would be appreciated by both participants and teachers, e.g. new courses, courses whose contents are more related to attitudes, or the introduction of new educational methods that expose the participants more, such as simulation.

The use of ARS for knowledge testing was positively evaluated by both participants and teachers. In our study, we included a few evaluation slides after the post-test. We recommend keeping the total number of questions reasonable, as a large number of questions may reduce the motivation to answer. Anonymous voting involves no risk for participants, and they may therefore choose to vote without reflecting on the questions posed if they feel that the questions take too long to answer.

When ARS is more integrated in the course and used as a discussion stimulation tool, its application is more challenging. About a third of the participants did not find that the questions were a good starting point for discussion about communication and ethics, although they found that ARS was a useful tool to start a discussion in general. One might speculate whether the prepared questions were insufficiently tested and thus supported the learning objectives inadequately. When ARS is used as a discussion stimulation tool, the teacher must use participants’ answers as a starting point for the discussion. The teacher needs to be prepared to use the participants’ responses immediately after the results are shown on screen. Likewise, the design of relevant questions is essential, and the objective of the questions as well as the context in which they appear must be well planned. Our findings are in agreement with previous findings [3, 6, 7, 18-20]. The teacher also found it challenging that the sequence of the questions was pre-defined and therefore could not be changed spontaneously.

Overall, presentations are vulnerable to technical problems, and a technical error can stop a voting session. The presentation should be pre-tested to minimize errors. It is essential to prepare the teacher in the use of ARS to achieve the full pedagogical benefit. It may be necessary to strengthen the teacher’s competences and to offer technical assistance and instructions for use.

The introduction of a voting tool does not automatically entail an improvement in the quality of teaching. The teaching method should be in focus and careful preparation and educational planning is necessary [1, 4, 15-20]. In order to use the system optimally, it is important for the teacher to anticipate how to respond to any given voting result. The opportunity for immediate feedback from all course participants is unique [1, 3, 18-20]. However, teachers find it challenging to immediately comment on voting results and then relate them to learning objectives. Our study indicated that thorough preparation and experience with ARS in various contexts facilitate the teacher’s optimal use of the voting tool. This finding is consistent with findings reported by other studies [1, 2, 4, 9, 12, 14-18].

We experienced that using a voting system comprises a combined technical and pedagogical challenge. In order to ensure teaching quality, an implementation strategy is necessary. Based on our initial experiences, a strategy for implementation of the ARS system in our institution was developed. Table 4 presents the main recommendations.

Overall, we find that ARS is a valuable technology with predominantly positive elements. The technology holds the potential to support learning, but teachers need to be technically and pedagogically well prepared to use the tool.

Correspondence: Jacob Vad Jensen, The Danish Institute for Medical Simulation, Herlev Hospital, Herlev Ringvej 75, 2730 Herlev, Denmark. E-mail: Jacobvad@gmail.com

Accepted: 22 August 2011

Conflicts of interest: Disclosure forms provided by the authors are available with the full text of this article at danmedbul.dk

Acknowledgement: We thank Anne Lippert, Danish Institute for Medical Simulation, for review.


REFERENCES

  1. Draper S, Brown M. Increasing interactivity in lectures using an electronic voting system. J Comp Assist Learn 2004;20:81-94.

  2. Duggan PM, Palmer E, Devitt P. Electronic voting to encourage interactive lectures: a randomised trial. BMC Med Educ 2007;7:25.

  3. Banks DA, ed. Audience response systems in higher education: applications and cases. University of South Australia, Australia: Idea Group Inc., 2006.

  4. Jensen JV, Østergaard D. Audience response-systemer styrker læringsmiljøet. Ugeskr Læger 2010, 16 Aug (epub ahead of print). www.ugeskriftet.dk/portal/page/portal/LAEGERDK/UGESKRIFT_FOR_LAEGER/Foerst_paa_nettet/VP11090512.pdf (26 Aug 2011).

  5. Palmer EJ, Devitt PG, Young NJ et al. Assessment of an electronic voting system within the tutorial setting: A randomised controlled trial. BMC Med Educ 2005;5:24.

  6. Uhari M, Renko M, Soini H. Experiences of using an interactive audience response system in lectures. BMC Med Educ 2003;3:12.

  7. Stuart SAJ, Brown M, Draper S. Using an electronic voting system in logic lectures: one practitioner’s application. J Comp Assist Learn 2004;20:95-102.

  8. Jacobs DG, Sarafin JL, Huynh T et al. Audience response system technology improves accuracy and reliability of trauma outcome judgments. J Trauma 2006;61:135-41.

  9. Nayak L, Erinjeri JP. Audience response systems in medical student education benefit learners and presenters. Acad Radiol 2008;15:383-9.

  10. Copeland HL, Hewson MG, Stoller JK et al. Making the continuing medical education lecture effective. J Cont Educ Health Prof 1998;18:227-34.

  11. Slain D, Abate M, Hodges BM et al. An interactive response system to promote active learning in the Doctor of Pharmacy curriculum. Am J Pharm Educ 2004;68:117.

  12. Cain J, Black EP, Rohr J. An audience response system strategy to improve student motivation, attention and feedback. Am J Pharm Educ 2009;73:21.

  13. Kennedy GE, Cutts QI. The association between students’ use of an electronic voting system and their learning outcomes. J Comp Assist Learn 2005;21:260-8.

  14. Crouch CH, Mazur E. Peer instruction: ten years of experience and results. Am J Phys 2001;69:970-7.

  15. Doucet M, Vrins A, Harvey D. Effect of using an audience response system on learning environment, motivation and long-term retention, during case-discussions in a large group of undergraduate veterinary clinical pharmacology students. Med Teach 2009;31:570-9.

  16. Elashvili A, Denehy GE, Dawson DV et al. Evaluation of an audience response system in a preclinical operative dentistry course. J Dent Educ 2008;72:1296-303.

  17. Rubio EI, Bassignani MJ, White MA et al. Effect of an audience response system on resident learning and retention of lecture material. Am J Roentgenol 2008;190:319-22.

  18. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ 2007;6:9-20.

  19. Robertson LJ. Twelve tips for using a computerised interactive audience response system. Med Teach 2000;22:3.

  20. Premkumar K, Coupal C. Rules of engagement – 12 tips for successful use of "clickers" in the classroom. Med Teach 2008;30:146-9.